The 68000 Wars, Part 1: Lorraine

This is what a revolutionary technology looks like. In very early 1986 Tim Jenison, founder of NewTek, began distributing these full-color digitized photographs, the first of their kind ever to be seen on a PC screen, to Amiga public-domain software exchanges. The age of multimedia computing had arrived.

The Amiga was the damnedest computer. A riddle wrapped in a mystery inside an enigma, then all crammed into a plastic case; that was the Amiga. I wrote a book about the thing, and I’m still not sure I can make sense of all of its complications and contradictions.

The Amiga was a great computer when it made its debut in 1985, better by far than anything else on the market. At its heart was the wonderchip of the era, the Motorola 68000, the same CPU found in the Apple Macintosh and the Atari ST. But what made the Amiga special was the stuff found around the 68000: three custom chips with the unforgettable names of Paula, Denise, and Agnus. Together they gave the Amiga the best graphics and sound in the industry by a veritable order of magnitude. And by relieving the 68000 of a huge chunk of the burden for generating graphics and sound as well as performing many other tasks, such as disk access, they let the Amiga dazzle while also running rings around the competition in real-world performance by virtually any test you cared to name. It all added up not just to incremental improvement but rather to that rarest thing in any field of endeavor: a generational leap.

Guru Meditation

The Amiga, especially in its original 1985 incarnation, was a terrible computer. The operating system that shipped with it was painfully buggy. If you could manage to use the machine for just an hour or two without it inexplicably running out of memory and crashing you were doing well. Other glitches were bizarrely entertaining if they didn’t happen to you personally, such as the mysterious “date virus” that could start to spread through all your disks, setting the timestamp on every file to sometime in the year 65,000 and slowing the system to a crawl. (No, this “virus” wasn’t actual malware, just a weird bug.) Of course, software could be and to a large extent eventually was fixed. Other problems were more intractable. There was, for instance, the machine’s use of interlaced video for its higher resolution modes, which caused those marvelous graphics to flicker horribly in most color combinations. Baffled users who felt like their swollen eyeballs were about to pop right out of their heads after a few hours of trying to work like this could expect to be greeted with a lot of technical explanations of why it was happening and suggestions for changing their onscreen color palettes to try to minimize it. Certainly anyone who picked up an Amiga expecting an experience similar to the famously easy-to-use Macintosh was in for a disappointment. Despite the Amiga’s sporting a superficially similar mouse-and-windows interface, users hoping to get serious work or play done on the Amiga would need to educate themselves on such technical minutiae as the difference between “chip” and “fast” memory and learn what a program’s “stack” was and how to set it manually. Even on a good day the Amiga always felt like a house of cards ready to be blown over by the first breath of wind. When the breeze came, the user was left staring at an inscrutable “Guru Meditation Error” and a bunch of intimidating numbers. Sometimes the Amiga could seem positively designed to confound.

The Amiga anticipated the future, marked the beginning of a new era. It pointed forward to the way we live and compute today. I titled my book on the machine The Future Was Here for a reason. That aforementioned generational leap in graphics and sound was the most significant in the history of the personal computer in that it made the Amiga not just a new computer but something qualitatively new under the sun: the world’s first multimedia PC. With an Amiga you could for the first time store and play back in an aesthetically pleasing way imagery and sound captured from the real world, and combine and manipulate and interact with it within the digital environment inside the computer. This changed everything about the way we compute, the way we play, and eventually the way we live, making possible everything from the World Wide Web to the iPod, iPad, and iPhone. Almost as significantly, the Amiga pioneered multitasking on a PC, another feature enabled largely by that magnificent hardware that was able to stretch the 68000 so much farther than other computers. There is considerable psychological research today that indicates that, for better or for worse, multitasking has literally changed the way we think, changed our brains — not a bad claim to fame for any commercial gadget. When you listen to music whilst Skyping on-and-off with a friend whilst trying to get that term paper finished whilst looking for a new pair of shoes on Amazon, you are what the Amiga wrought.

The Amiga was stuck in the past way of doing things, thus marking the end of an era as well as the beginning of one. It was the punctuation mark at the end of the wild-and-wooly first decade of the American PC, the last time an American company would dare to release a brand new machine that was completely incompatible with what had come before. Its hardware design reflected the past as much as the future. Those custom chips, coupled together and to the 68000 so tightly that not a cycle was wasted, were a beautiful piece of high-wire engineering created by a bare handful of brilliant individuals. If a computer can be a work of art, the Amiga certainly qualified. Yet its design was also an evolutionary dead end; the custom chips and all the rest were all but impossible to pull apart and improve without breaking all of the software that had come before. The future would lie with modular, expandable design frameworks like those employed by the IBM PC and its clones, open hardware (and software) standards that were nowhere near as sexy or as elegant but that could grow and improve with time.

The Amiga was a great success, the last such before the Wintel hegemony, which already dominated business computing by the mid-1980s, expanded to dominate home computing as well. Its gaming legacy is amongst the richest of any platform ever, including some fifteen years’ worth of titles that, especially during the first half of that period, broke boundaries at every turn and expanded the very notion of what a computer game could be. I won’t even begin to list here the groundbreaking classics that were born on the Amiga; suffice to say that they’ll be featuring in this blog for years to come. The Amiga was so popular a gaming platform in Europe that it survived many years after the death of its corporate parent Commodore, a phenomenon unprecedented in consumer computing. The last of the many glossy newsstand magazines devoted to it, Britain’s Amiga Active, didn’t cease publication until November of 2001, well over seven years after the platform became an orphan. It would prove to be just as long-lived in its other major niche as a video-production workstation. Thanks to their unique ability to blend their own visuals with analog video signals — enabled, ironically, by those very same interlaced video modes that drove so many users crazy — Amigas could be found in the back rooms of small cable stations and video producers into the 2000s. Only the great changeover to digital HD broadcasting finally and definitively put an end to the Amiga’s career in this realm.

The Amiga was a bitter failure, one of the great might-have-beens of computer history. In 1985 so many expected it to become so much more than just another game machine or even “just” the pioneer of the whole new field of desktop video, forerunner of the YouTube generation. The Amiga, believed its early adopters, was so much better — not just technically better but conceptually better — than what was already out there that it was surely destined to conquer the world. After all, business-software heavy hitters like WordPerfect, Borland, Ashton-Tate, and Lotus knew a good thing when they saw it, and were already porting their applications to it. And yet in the end only WordPerfect came through, for a while, and, while the Amiga did change the world in the long term, its innovations were refined and made a part of everyday life by Apple and Microsoft rather than the Amiga itself. The vast majority of heirs to the Amiga’s legacy today — a number which includes virtually every citizen of the developed world — have no idea a computer called the Amiga ever existed.

That’s just a sample of the contradictions awaiting any writer who tries to seriously tackle the Amiga as a subject. And there’s also another, more ironic sort of difficulty to be confronted: the sheer love the Amiga generated on the part of so many who had one. The Amiga, I must confess, was my own first computing love. Since that day in 1994 when I gave in and bought my first Wintel machine, I’ve been platform-agnostic. Linux and Apple zealots and Microsoft apologists all leave me cold, leave me wondering how people can get so passionate about any platform not called Amiga. Of course I’m smart enough to realize that none of this is really all that important, that a gadget is just that, a means to an end. I even recognize that, had the Amiga not come along when it did to pioneer a new paradigm for computing, something else would have. That’s just how history works. But still, there was something special about the Amiga for those of us who were there, something going far beyond even a hacker’s typical love for his first computer.

To say Amiga users had — still have — a reputation for zealotry hardly begins to state the case. General-computing magazines from the late 1980s until well into the 1990s learned to expect a deluge of hate mail from Amiga users every time they published an article that dared say an unfavorable word about the platform — or, worse, and as inevitably happened more and more frequently as time went on and the Amiga faded further from prominence, that didn’t mention it at all. Prominent mainstream columnist John C. Dvorak liked to say that, whereas Mac users were just arrogant and self-righteous, Amiga users were actively delusional. There are still folks out there clinging to their 25-year-old Amigas, patched together with the proverbial duct tape and baling wire, as their primary computing platform. A disturbing number of them are still waiting for the day when the Amiga shall rise again and take over the world, even as it’s hard to understand what a modern Amiga should even be or why it should exist in a world that long since incorporated all of the platform’s best ideas into slicker, simpler gadgets.

Every good cult needs an origin myth, and the Cult of Amiga is no exception. Beginning already in the machine’s North American heyday of the late 1980s, High Priest R.J. Mical, developer of the Amiga’s Intuition library of GUI widgets as well as other critical pieces of its software infrastructure, began traveling to trade shows and conventions telling in an unabashedly sentimental way the story of those earliest days, when the Amiga was being developed by a tiny independent company, itself called simply Amiga, Incorporated.

We were trying to find people that had fire, that had spirit, that had a dream they were trying to accomplish. Carl Sassenrath, the guy that did the Exec for the machine, it was his lifelong dream to do a multitasking operating system that would be a work of art, that would be a thing of beauty. Dale Luck, the guy that did the graphics, this was his undying dream since he was in college to do this incredible graphics stuff.

We were looking for people with that kind of passion, that kind of spirit. More than anything else, the thing that we were looking for was people who were trying to make a mark on the world, not just in the industry but on the world in general. We were looking for people that really wanted to make a statement, that really wanted to do an incredibly great thing, not just someone who was looking for a job.

Yes. Well. While idealism certainly has its place in the Amiga story, the story is also a very down-to-earth tale of competition inside Silicon Valley. It begins in 1982 with an old friend of ours: Larry Kaplan, one of the Fantastic Four game programmers from Atari who founded Activision along with Jim Levy.

Activision was flying high in 1982, the Fantastic Four provided, in Kaplan’s own words, with “limousine service, company cars, and a private chef” on top of a base salary of $150,000. Yet Kaplan, who is often described by others as the very apotheosis of “the grass is always greener,” was restless. He had the idea to form another company, one all his own this time, to enter the booming Atari VCS market. One day in early 1982 he called up an old colleague of his from the Atari days: Jay Miner, who had designed the Atari VCS’s display chip, then gone on to design the chipset at the heart of the Atari 400 and 800 home computers. Kaplan, along with two others of the Fantastic Four, had written the operating system and BASIC language implementation for those machines. He thus knew Miner well. Knowing the vagaries of business and starting his own company somewhat less well than he knew Miner and programming, his initial query was a simple one: “I’d like to start a company. Do you know any lawyers?”

Miner, who had left Atari at around the same time as the Fantastic Four out of a similar disgust with new CEO Ray Kassar, had also left Silicon Valley to move to Freeport, Texas, where he worked for a small semiconductor company called Zymos, designing chips for pacemakers and other medical devices. Miner said that, no, he wasn’t particularly well-acquainted with any lawyers, good or otherwise, but that his boss, Zymos founder Bert Braddock, had a pretty good head for business. He made the introduction, and Kaplan and Braddock hit it off. The plan that Kaplan presented to him was to combine hardware and software in the booming home videogame space, offering hardware to improve on the Atari VCS’s decidedly limited capabilities along with game cartridges that took advantage of the additional gadgetry. Such a scheme was hardly original to him; confronted with the VCS’s enormous popularity and equally enormous limitations, others were already working the same space. For example, two other former Atari engineers, Bob Brown and Craig Nelson, had already formed Starpath to develop a “Supercharger” hardware expansion for the VCS as well as games to play with it. (Starpath would go on to merge with the newly renamed Epyx — née Automated Simulations — and write games like Summer Games.)

Nevertheless, Braddock sensed a potentially fruitful partnership in the offing for a maker of chips like his Zymos. He found Kaplan some investors in nearby oil-rich Houston to put up the first $1 million or so to get the company off the ground. He also found and recruited one Dave Morse, a vice president of marketing at Tonka Toys, to join Kaplan, believing him to be exactly the savvy business mind and shrewd negotiator the venture needed. An informal agreement was reached amongst the group: Morse would run the new company; Kaplan would write the games; Miner (working under contract, being still employed by Zymos) would design the ancillary hardware; and Zymos would manufacture the hardware and the game cartridges. Somewhere at the back of everyone’s mind was the idea that, if they were successful with their games and add-on gadgets, they might just be able to take the next step: to make a complete original game console of their own, the successor to the Atari VCS that Ray Kassar’s Atari didn’t seem all that interested in seriously pursuing.

In June of 1982, Kaplan announced to his shocked colleagues at Activision that he was moving on to do his own thing; the bridges he thus burnt have never been mended to this day. He and Morse opened a small office in Santa Clara, California, for their new company, which Kaplan named Hi-Toro. Morse and Braddock — truly a sugar daddy to die for, for a fledgling corporation — beat the bushes over the months that followed for additional financing, with success to the tune of another $5 million or so. The majority of these new investors were dentists and other members of the medical establishment, thanks to Braddock’s connections in that field. They knew little to nothing about computer technology, but knew very well that videogames were hot, and were eager to get in on the ground floor of another Atari.

And then the squirrely Larry Kaplan nearly undid the whole thing. He called Atari founder Nolan Bushnell that October to talk up his new company, hoping to convince him to join Hi-Toro as chairman of the board; a name like his would confer instant legitimacy. Instead the hunter became the hunted. Bushnell, who was legendary for the buckets of charm at his fingertips, convinced Kaplan to come to him, convinced him they could start a new videogame company to rival Atari together, without Zymos or Morse or Miner. Just like that, Kaplan tendered his second shocking resignation of 1982. In the end, as Kaplan later put it, “Nolan, of course, flaked out,” leaving him high and dry, if quite possibly deservedly so. He would end up completing the circle by going back to Atari before the year was up, but that gig ended when the Great Videogame Crash of 1983 hit. By that point widely regarded inside the industry as too untrustworthy to be worth the trouble, Kaplan never saw his career recover. On the plus side, he was able to cash out his Activision stock following that company’s IPO, making him quite a wealthy man and making future work largely optional anyway — not the worst of petards for a modern-day Claudius.

Dave Morse, meanwhile, was also left high and dry, with a company and an office and lots of financing but nobody to design his products. He asked Jay Miner to leave Zymos and join him full-time at Hi-Toro, to help fill the vacuum left by Kaplan’s departure. Miner, who had been nursing for some time now a dream of doing a game console and/or a computer based around the new Motorola 68000 and who saw Hi-Toro as just possibly his one and only chance to do that, agreed — so long as he could bring his beloved cockapoo Mitchy with him to the office every day.

One of the first things to go after Kaplan left was the company name he had come up with. Everyone Morse and Miner spoke to agreed that “Hi-Toro” was a terrible name that made one think of nothing so much as lawn mowers. Morse therefore started flipping through a dictionary one day, looking for something that would come before Apple and Atari in corporate directories. He hit upon the Spanish word for “friend”: “amigo.” That had a nice ring to it, especially with “user-friendliness” being one of the buzzwords of the era. But the feminine version of the word — “amiga” — sounded even better, friendly and elegant, maybe even a little bit sexy. Miner by his own later admission was ambivalent about the new name, but everyone Morse spoke to seemed very taken with it, so he let it go. Thus did Hi-Toro become Amiga.

Of course, Morse and Miner couldn’t do all the work by themselves. Over the months that followed they assembled a team whose names would go down in hacker lore. An old colleague from Atari who had worked with Miner on the VCS as well as the 400 and 800, Joe Decuir, came in under a temporary contract to help Miner start work on a new set of custom chips. A few other young hardware engineers were hired as full-time employees. Morse hired one Bob Pariseau to put together a software team; he became essentially the equivalent of Jay Miner on that side of the house. The software people would soon grow to outnumber the hardware people. Among their ranks were now-legendary Amiga names like R.J. Mical, Dale Luck, and Carl Sassenrath.

The folks who came to work at Amiga were almost universally young and largely inexperienced. While tarring them with the clichéd “dreamers and misfits” label may be going too far, it is true that their backgrounds were more diverse than the Silicon Valley norm; Mical, for instance, was a failed English major who had recently spent nine months backpacking his way around the world. While their youthful idealism would do much to give the eventual Amiga computer its character, there was also a very practical reason that Morse had to fill his office with all these bright young sparks: what with financing getting harder and harder to come by as the videogame industry began to go distinctly soft, he simply couldn’t afford to pay for more experienced hands. Amiga’s financial difficulties provided the opportunity of a lifetime to a bunch of folks that may have struggled to get in the door in even the most junior of positions at someplace like Apple, IBM, or Microsoft.

The glaring exception to the demographic rule at Amiga was Jay Miner himself. Creative, bleeding-edge engineering is normally a young person’s game. Miner, however, was fully 50 years old when he created his masterpiece, the Amiga chipset. He’d already been designing circuits twenty years before the microprocessor even existed and well before some of his colleagues around the office were even born. Thanks perhaps to intermittent but chronic kidney problems that would eventually kill him at age 62, he looked and in some ways acted even older than his years, favoring quiet, contemplative hobbies like cultivating bonsai trees and carving model airplanes out of balsa wood. Adjectives like “fatherly” rival “soft-spoken” and “wise” in popularity when people who knew him remember him today. While the higher-strung Dave Morse became the face Amiga showed to the outside world, Miner set the internal tone, tolerating and even encouraging the cheerful insanity that was life inside the Amiga offices. Miner:

The great things about working on the Amiga? Number one I was allowed to take my dog to work, and that set the tone for the whole atmosphere of the place. It was more than just companionship with Mitchy — the fact that she was there meant that the other people wouldn’t be too critical of some of those we hired, who were quite frankly weird. There were guys coming to work in purple tights and pink bunny slippers. Dale Luck looked like your average off-the-street homeless hippy with long hair and was pretty laid-back. In fact the whole group was pretty laid-back. I wasn’t about to say anything — I knew talent when I saw it and even Pariseau who spread the word was a bit weird in a lot of ways. The job gets done and that’s all that matters. I didn’t care how solutions came about even if people were working at home.

The question of just what this group was working on, and when, is harder to answer than you might expect. When we use the word “Amiga” to refer to this era, we could be talking about any of three possibilities. Firstly, there’s Amiga the company, which during its early months put well over half of its personnel and resources into games and add-ons for the old Atari VCS rather than revolutionary new technology. Then there’s the Amiga chipset being designed by Miner and his team. And finally there’s a completed game console and/or computer to incorporate the chipset. Making sense of this tangle is complicated by revisionist retellings, which tend to find grand plans and coherent narratives where none actually existed. So, let’s take a careful look at each of these Amigas, one at a time.

The Amiga Joyboard

Kaplan’s original plan had envisioned Hi-Toro/Amiga as a maker first and foremost of cartridges and hardware add-ons for the VCS, with a whole new console possibly to follow if things went gangbusters. These plans got reprioritized somewhat when Kaplan left and Miner came aboard with his eagerness to do a console and/or computer, but they were by no means entirely discarded. Thus Amiga did indeed create a handful of original games over the course of 1983, along with joysticks and other hardware. By far the most innovative and best-remembered of these products was something called the Joyboard: a large, flat slab of plastic on which the player stood and leaned side to side and front to back to control a game in lieu of a joystick. Amiga packaged a skiing game, Mogul Maniac, with the Joyboard, and developed at least two more — a surfing game called Surf’s Up and a pattern-matching exercise called Off Your Rocker — that never saw release. The Joyboard and its companion products have been frequently characterized as little more than elaborate ruses designed to keep the real Amiga project under wraps. In reality, though, Morse had high commercial hopes for this side of his company; he was in fact depending on these products to fund the other side of the operation. He spent quite lavishly to give the Joyboard a splashy introduction at the New York Toy Fair in February of 1983, and briefly hired former Olympic skier Suzy Chaffee — better known to a generation of Americans as “Suzy Chapstick” thanks to her long-running endorsement of that brand — to serve as spokesperson. His plans were undone by the Great Videogame Crash. The peripherals and games all failed miserably, precipitating a financial crisis at Amiga to which I’ll return shortly.

The chips were always Jay Miner’s babies. Known in the early days as Portia, Daphne, and Agnus, later iterations would see Portia renamed to Paula and Daphne to Denise. Combined with a 68000, they offered unprecedented audiovisual capabilities, including a palette of 4096 colors and four-channel stereo sound. Their most innovative features were the so-called “copper” and “blitter” housed inside Agnus. The former, which could also be found in a less advanced version in Miner’s previous Atari 400 and 800, could run short programs independent of the CPU to change the display setup on the fly in response to the perpetually repainting electron gun behind the television or monitor reaching certain points in its cycle. This opened the door to a whole universe of visual trickery. The blitter, meanwhile, could be programmed to copy blocks of memory from place to place at lightning speeds, and in the process perform transformations and combinations on the data — once again, independent of the CPU. It was a miracle worker in the realm of fast animation. While not programmable in the same sense as the copper and the blitter, Denise autonomously handled the task of actually painting the display, while Paula could autonomously play back up to four sound samples or waveforms at a time, and also independently handle input and output to disk. (This is the briefest of technical summaries of the Amiga chipset. For a detailed description of the chipset’s internal workings as well as many important aspects of its host platform’s history that I’ll never get to in this game-focused blog, I point you again to my own book on the subject.)
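For the curious, here is one way to picture what a copper program actually is: a short list of instructions that waits for the video beam to reach a given position and then pokes values into chipset registers, all without the 68000’s involvement. The C encoding below is invented purely for illustration (the real copper uses its own compact two-word instruction format), though COLOR00 really is the name and offset of the Amiga’s background-color register.

```c
#include <stdint.h>

/* Two kinds of copper instructions, per the description above: WAIT for the
   beam to reach a scan line, and MOVE a value into a chipset register.
   This struct is an invented, readable stand-in for the real encoding. */
typedef enum { WAIT, MOVE } CopOp;
typedef struct {
    CopOp    op;
    int      line;   /* for WAIT: the scan line to wait for       */
    uint16_t reg;    /* for MOVE: which chipset register to write */
    uint16_t value;  /* for MOVE: the value to write              */
} CopIns;

#define COLOR00 0x180 /* the background-color register's real offset */

/* "When the beam finishes line 150, turn the rest of the screen red":
   a mid-frame palette change that costs the CPU nothing at all. */
CopIns copper_list[] = {
    { WAIT, 150, 0,       0      },
    { MOVE, 0,   COLOR00, 0x0F00 }, /* 0x0F00 = red in 4-bit-per-gun RGB */
};
```

Chain enough of these WAITs and MOVEs together and you can give every band of the screen its own palette or display setup, which is exactly the “universe of visual trickery” just described.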

Amiga’s ultimate vision for their chipset — whether in the form of a game console, a computer, a standup arcade game, or all three — is the most difficult part of all their tangled skein of intentionality to unravel, and the one most subject to revisionist history. Amiga fanatics of later years, desperate to have their platform accepted as a “serious” computer like the IBM PC or Apple Macintosh, became rather ashamed of its origins in the videogame industry. This has occasionally led them to say that the Amiga was always secretly intended to be a computer, that the videogame plans were just there to fool the investors and keep the money flowing. In truth, there’s good reason to question whether there was any real long-term plan at all. Miner noted in later interviews that the company was quite split on the subject, with — ironically in light of his later status as Amiga High Priest — R.J. Mical on the “investors’ side,” pushing for a low-cost game console, while others like Dale Luck and Carl Sassenrath wanted an Amiga computer. Miner himself claimed to have envisioned a console that could be expanded into a real computer with the addition of an optional keyboard and disk drive. (Amiga also had similar plans for the Atari VCS in the form of something to be called the Amiga Power Module, yet another project killed by the videogame collapse.) Dave Morse, who died in 2007, is not on record at all on the subject. One suspects that he was simply in wait-and-see mode through much of 1983.

What is clear is that the first Amiga machine to be shown to the public wasn’t so much a prototype of a real or potential computer or game console as the most minimalist possible frame to show off the capabilities of the Amiga chipset. Named after Morse’s wife, the Amiga Lorraine began to come together in the dying days of 1983, in a mad scramble leading up to the Winter Consumer Electronics Show that was scheduled to begin on January 4. Any mad scientist would have been proud to lay claim to the contraption. Miner and his team built their chipset, destined eventually to be miniaturized and etched into silicon, out of off-the-shelf electronics components, creating a pile of breadboards large enough to fill a kitchen table, linked together by a spaghetti-like tangle of wires, often precariously held in place with simple alligator clips. It had no keyboard or other input method; the software team wrote programs for it on a workstation-class 68000-based computer called the Sage IV, then uploaded them to the Lorraine and ran them via a cabled connection. The whole mess was a nightmare to maintain, with wires constantly falling off, pieces overheating, or circuits shorting out seemingly at random. But when it worked it provided the first tangible demonstration of Miner’s extraordinary design. Amiga accordingly packed it all up and transported it — very carefully! — to Las Vegas for its coming-out party at Winter CES.

R.J. Mical and Dale Luck, amongst others, had worked feverishly to create a handful of demos to show off in a private corner of Amiga’s CES booth, open only by invitation to hand-selected members of the press and industry. The hit of the bunch, written by Mical and Luck at the show itself in one feverish all-night hacking session fueled by “a six pack of warm beer,” was a huge, checked soccer ball that bounced up and down, prototype of one of the most famous computerized demos of all time. The bouncing soccer ball — the “boing” ball — would soon become the unofficial symbol of Amiga.


Boing and the other demos were impressive, but the hardware was obviously still in a very rough state, still a long, long way away from any sort of salable product. Many observers were frankly skeptical whether this mass of breadboards and wires even could be turned into the three chips Amiga promised, and if so whether those chips could, complicated as they must inevitably be, be cost-effectively manufactured. Two obvious applications of the chipset, to a new videogame console or to standup arcade games, were facing a gale-force headwind following the Great Videogame Crash of the previous year. Nobody wanted anything to do with that market anymore. And introducing yet another incompatible computer into the market, no matter how impressive its hardware, looked like a high-risk proposition as well. Thus most visitors were impressed but carefully noncommittal. Was there really a place for Amiga’s admittedly extraordinary technology? That was the question. Tellingly, of the glossy magazines, only Creative Computing bothered to write about Lorraine in any real detail, excitedly declaring it to have “the most amazing graphics and sound that will ever have been offered in the consumer market.” (Just to show that prescience isn’t always an either/or proposition, the same journalist, John J. Anderson, noted how important it would be to make sure any eventual Amiga computer was compatible with the IBM PCjr, which was sure to take over the industry.)

Thus Amiga’s coming-out party is best characterized as having mixed results on the whole, leading to lots of impressed observers but no new investors. And that was a big, big problem because Amiga was quickly running out of money. With the VCS products having not only failed to sell but also absorbed millions in their own right to develop, Amiga’s financial picture was getting more desperate by the week. One thing was becoming clear: there was no way they were going to be able to secure the investment needed to turn the Lorraine into a completed computer — or a completed anything else — and market it themselves. It seemed that they had three options: license the technology to someone else with deeper pockets, sell themselves outright to someone else, or go quietly out of business. As the founders mortgaged their houses to make payroll and Morse begged his creditors for loan extensions, the only company that seemed seriously interested in the Amiga chipset was the one Jay Miner would least prefer to get in bed with once again: Atari.

An Atari old-timer named Mike Albaugh had first visited Amiga well before the CES show, in November of 1983. He was given an overview of the as-yet-extant-on-paper-only chipset’s features and, knowing very well the capabilities of Jay Miner, expressed cautious interest. After their first tangible glimpse of the chipset’s capabilities at CES, Atari got serious about acquiring this incredible technology from a company that seemed all but at their mercy, desperate to make a deal that would let them stay alive a little longer. With no other realistic options on the table, Dave Morse negotiated with Atari as best he could from his position of weakness. Atari had no interest in buying a completed machine, whether of the game-console or computer variety. They just wanted that wonderful chipset. The preliminary letter of intent that Amiga and Atari signed on March 7, 1984, reflects this.

That same letter of intent, and the $500,000 that Atari transferred to Amiga as part of it, would lead to a legal imbroglio lasting years. The specifics that the letter contained, as well as — equally importantly — what it did not contain, remain persistently misunderstood to this day. Thankfully, the original agreement has been preserved and made available online by Atari historians Marty Goldberg and Curt Vendel. I’ve taken the time to parse this document closely, and also enlisted the aid of a couple of acquaintances with better legal and financial minds than my own. Because it’s so critical to the story of Amiga, and because it’s been so widely misunderstood and misconstrued, I think it’s worth taking a moment here to look fairly closely at its specifics.

The document outlines a proposed arrangement granting Atari exclusive license to the chipset for use in home videogame consoles and standup arcade games, in perpetuity from the time that the finalized agreement is signed. The proposal also grants Atari a nonexclusive license to use the chips in a personal computer, subject to the restriction that Atari may offer an add-on kit to turn a game console using the chips into a full-blown computer no earlier than June of 1985, and a standalone computer using the chips no earlier than March of 1986. Before and continuing after Atari makes their computer using the chips, Amiga may make one of their own, but may only sell it through specialized computer dealers, not mass merchandisers like Sears or Toys ‘R’ Us. Atari, conversely, will be restricted to the mass merchandisers. The obvious intention here is to target Amiga’s products to the high-end, professional market, Atari’s to gamers and casual users. Atari will pay Amiga a royalty of $2 per computer or game console containing the chipset sold, $15 per standup arcade videogame. Note that the terms I’ve just described are only a proposal pending a finalized license agreement, without legal force — unless certain things happen to automatically trigger their going into effect, which I’ll get to momentarily.

Now let’s look at the parts of the document that do have immediate legal force. Amiga being starved for cash and still needing to do considerable work to complete the chipset, Atari will give Amiga an immediate “loan” of $500,000, albeit one which they never really expect to see paid back; again, I’ll explain why momentarily. Atari will then continue to give Amiga more loans on a milestone basis: $1 million when a finalized licensing agreement is signed; $500,000 when each of the three chips is completed and delivered to Atari ready for manufacturing. And here’s where things get tricky: once all of the chips are delivered and a licensing agreement is in place, Amiga’s outstanding loan obligations will be converted into a purchase by Atari of $3 million worth of Amiga stock. If, on the other hand, a finalized licensing agreement has not been signed by March 31 — just three weeks from the date of this preliminary agreement — Amiga will be expected to pay back the $500,000 to Atari by June 30, plus interest of 120 percent of the current Bank of America prime rate, assuming some other deal is not negotiated in the interim. If Amiga cannot or will not do so, the proposed licensing agreement outlined above will automatically go into effect as a legally binding contract, with the one very significant change that Atari will not need to pay any royalties at all — the license “shall be fully paid in exchange for cancellation of the loan.” The Amiga chipset thus serves as collateral for the loan, its blueprints and technical specifications being held in escrow by a neutral third party (the Bank of America).

There are plenty of other technicalities — for instance, Atari will be allowed to bill Amiga for their time and other resources if Amiga fails to complete the chipset, thus forcing Atari’s engineers to finish the job — but I believe I’ve covered the salient points here. (Those deeply interested or skeptical of my conclusions may want to look at a more detailed summary I prepared, or, best of all, just have a look at the original.) Looking at the contract, what jumps out first is that it wasn’t a particularly good deal for Amiga. To pay a mere $2 per console or computer sold when the chipset being paid for must be the component that literally makes that console or computer what it is seems shabby indeed. For Atari it would have represented the steal of the century. Why would Morse sign such an awful deal?

The obvious answer must of course be that he was desperate. While it’s perhaps dangerous to ascribe too much motivation to a dead man who never publicly commented on the subject, circumstantial evidence would seem to characterize this agreement as the wind-up to a final Hail Mary, a way to secure a quick $500,000 for the here and now, to keep the lights on a little longer and hope for a miracle. Morse did not sign a final licensing agreement by March 31, a very risky move indeed, as it gave Atari the right to automatically start using Amiga’s chipset, without having to pay Amiga another cent, if Morse couldn’t negotiate some other arrangement with them or find some way to pay back the $500,000 plus interest before June 30. Carl Sassenrath once described Morse as “my model for how to be cool in business.” Truly he must have had nerves of steel. And, incredibly, he would get his miracle.

(Sources: On the Edge by Brian Bagnall. Amiga User International of June 1988 and March 1993. Info of January/February 1987 and July/August 1988. Creative Computing of April 1984. Amazing Computing, premiere issue. InfoWorld of July 12, 1982. Commander of August 1983. Scott Stilphen’s interview with Larry Kaplan on the 2600 Connection website. Thanks also to Marty Goldberg for patiently corresponding with me and giving me Atari’s perspective, although I believe his conclusions about the Amiga/Atari negotiations and particularly his reading of the March 7, 1984 agreement to be in error. And yeah, there’s my own book too…)

 
 


How Things Work: Commodore 64 and Summer Games Edition

I’m always trying to convey a sense of the audacity and creativity of hackers of the early PC era, who made so much out of so little. I include amongst this group both the hardware hackers who created the machines themselves and the software hackers who took them to places even their creators never imagined. In that spirit, I thought today we’d look at how the Commodore 64’s hardware team managed to make it do some of what it could given the technical constraints under which they labored, and how the software team who created Summer Games at Epyx found ways to make it do even more than they had fully considered. So, much of this article is for the gearheads among you, or at least those of you who’d like to understand a bit more of what the gearheads are on about. If you’re a less technical sort, perhaps you’ll be consoled by learning about some of the softer factors that went into the Summer Games design as well. And if that’s not interesting, hey, you can still watch my wife and me (mostly me) fail horribly at various Summer Games events via the movie clips.

This is, by the way, my first attempt to make use of WordPress 3.6’s integrated video capabilities. You’ll need an up-to-date browser with good HTML 5 support to see the clips. Hopefully my site won’t choke on the bandwidth demands. We’ll see how we go.

While you’re waiting (hopefully not too long) for the videos to load, let’s consider the basic visual capabilities of the Commodore 64: a palette of 16 colors at a resolution of 320 X 200. Those capabilities are, to say the least, modest by modern standards, but they actually present a huge problem when paired with another key specification: the 64 has just 64 K of RAM. This is all there is to work with; there is no separate bank of video memory, as on a modern computer. Everything — programs, data, the contents of the screen, and miscellaneous other things like buffers for the disk drive — must draw from this pool.

Now, a modern programmer wishing to represent a 320 X 200 screen with 16 colors in memory would probably just store it as a series of pixels, with one byte devoted to each pixel and storing a value of between 0 and 15 to represent that pixel’s color. This approach, known as bitmap graphics, is straightforward and eminently flexible, but there’s a problem. Consider: a 320 X 200 screen has exactly 64,000 pixels. In other words, by devoting one byte to each pixel we’ve just filled our entire 64 K of memory with a single screen.

Let’s reconsider, then. Even a modern programmer, if she’s a more efficient sort, might note that we only actually need four bits to store a number between 0 and 15, and could therefore, at the cost of a somewhat more confusing layout, pack two pixels into every byte. That reduces consumption to a little under 32 K — better, but it’s still untenable to devote half of our precious memory to the screen.
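To make the arithmetic concrete, here it is as a trivial C snippet (C is used for all the sketches in this article purely for readability; the 64 itself was programmed in 6502 assembly and BASIC):

```c
#include <stdio.h>

int main(void) {
    int pixels = 320 * 200;  /* 64,000 pixels on the full screen */
    printf("1 byte per pixel: %d bytes\n", pixels);      /* the entire 64 K  */
    printf("4 bits per pixel: %d bytes\n", pixels / 2);  /* still about 31 K */
    return 0;
}
```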

It’s because bitmap graphics are so demanding that only high-end machines like the Apple Lisa and Macintosh used them by default at the time of Summer Games’s release. And, notably, even those 68000-powered machines only displayed black and white, which reduced the requirement from four bits per pixel to one — a simple on-off, black-or-white toggle. Let’s consider the alternative that the 64’s designers, as well as those of many other machines, employed in various ways: character graphics.

Commodore 64 startup screen

In its default mode, the 64 subdivides its screen into a grid of character cells, each 8 X 8 pixels. Thus there are 40 of them across and 25 down, corresponding to the machine’s standard text display. Elsewhere in memory are a set of up to 256 tiles that can be copied into these cells. A default set, containing the glyph for each letter, number, and mark of punctuation in addition to symbols and simple line-drawing figures, lives in ROM. The programmer can, however, swap this set out for her own set of tiles. This system is conceptually the same as the tile-graphics system which Richard Garriott used in the Ultima games, but these tiles are smaller (only the size of a single character) and monochrome, just a set of bits in which 1 represents a pixel in the foreground color, 0 a pixel in the background color. The latter color is set globally, for the whole screen. The former is specified individually for each cell, via a table stored elsewhere in memory.

So, let’s look at what all this means in terms of memory. Each cell on the screen consumes one byte, representing the number (0 to 255) of the tile that is placed there. There are 1000 character cells on a 40 X 25 display, so that’s about 1 K consumed. We need 8 bytes to store each tile as an 8 X 8 grid of on-off pixels. If we use all 256, that’s 2 K. Finally, the color table with the foreground color for each cell fills another 1 K. We’ve just reduced 32 K to 4 K, or just 2 K if we use the default set of character glyphs in ROM. Not bad. Of course, we’ve also introduced a lot of limitations. We now have to build our display, jigsaw-puzzle style, from our collection of tiles. And each cell can only use two of our total of 16 colors, one of which can be unique to that cell but the other of which must be the same for the entire screen. For someone wishing to make a colorful game, this last restriction in particular may just be too much to accept.
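In code, recovering the color of a single pixel under this scheme might look like the following sketch. The array names are hypothetical stand-ins for the 64’s screen memory, character set, and color table:

```c
#include <stdint.h>

uint8_t screen_ram[1000];  /* one tile number per cell: 40 x 25, about 1 K */
uint8_t char_set[256][8];  /* 8 bytes per tile, one bit per pixel: 2 K     */
uint8_t color_ram[1000];   /* foreground color per cell: another 1 K      */
uint8_t background_color;  /* one background color for the whole screen   */

uint8_t pixel_color(int x, int y) {
    int cell = (y / 8) * 40 + (x / 8);               /* which character cell  */
    uint8_t row = char_set[screen_ram[cell]][y % 8]; /* that tile's pixel row */
    int on = (row >> (7 - (x % 8))) & 1;             /* is this bit set?      */
    return on ? color_ram[cell] : background_color;  /* foreground/background */
}
```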

Enter multicolor character mode. Here, we tell the 64 that we want each tile to be not monochrome but drawn in four colors. Rather than using one bit per pixel within the tile, we now use two, which allows us to represent any number from 0 to 3. One of these colors is still set individually for each cell; the other three are set globally, for the screen as a whole. And there’s another, bigger catch: because we still only devote eight bytes to each tile, we must correspondingly reduce its resolution, and that of the screen as a whole. Each tile is now 4 X 8 (horizontally elongated) pixels, the screen as a whole 160 X 200. Even so, this is easily the most widely used mode in Commodore 64 games. It’s also the mode that Scott Nelson (little brother of Starpath co-founder Craig Nelson) chose for Summer Games’s flag selection screen.

Summer Games country selection screen
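Decoding a pixel in multicolor character mode works the same way, except that the tile bytes are read two bits at a time. A sketch, reusing the hypothetical arrays from above:

```c
#include <stdint.h>

extern uint8_t screen_ram[1000], char_set[256][8], color_ram[1000];

uint8_t shared[3]; /* bit pairs 00, 01, and 10: the three globally shared colors */

uint8_t mc_pixel_color(int x, int y) {  /* x now runs only 0..159 */
    int cell = (y / 8) * 40 + (x / 4);               /* 4 wide pixels per cell */
    uint8_t row = char_set[screen_ram[cell]][y % 8];
    int pair = (row >> (6 - 2 * (x % 4))) & 3;       /* two bits per pixel     */
    return (pair == 3) ? color_ram[cell] : shared[pair]; /* 11 = per-cell color */
}
```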

But… wait, you might be saying. Surely the colorful screen shown above doesn’t always use the same three of the four colors within each tile. In fact, it doesn’t, and this introduces us to one of the keys to getting the most out of the Commodore 64: raster interrupts.

The picture on a cathode-ray-tube television or monitor is generated by an electron gun which moves across and down behind the screen, firing charged electrons at phosphors that coat the back of the screen glass. This causes them to briefly glow — so briefly, in fact, that the gun must paint the screen 60 times per second for televisions using the North American NTSC standard, or 50 times for the European PAL standard, in order to display a stable image without flicker. After painting each line of the screen from left to right, the gun must move back to the left to paint the next. This split second’s delay can be exploited by the Commodore 64 programmer. She can ask the machine to generate what’s known as a raster interrupt when the gun finishes painting a given line. She then has a few microseconds to make changes to the display configuration before the gun starts painting the next line. She can, for example, change one or more of the three supposedly fixed colors, as Scott Nelson does to generate the screen shown above.
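Here is a minimal sketch of that setup, with C standing in for what would really be a few lines of 6502 assembly wired into the machine’s interrupt vector. The register addresses, at least, are the real VIC-II locations:

```c
#include <stdint.h>

#define RASTER_LINE (*(volatile uint8_t *)0xD012) /* line to trigger on (low 8 bits)  */
#define IRQ_STATUS  (*(volatile uint8_t *)0xD019) /* write a 1 to acknowledge         */
#define IRQ_ENABLE  (*(volatile uint8_t *)0xD01A) /* bit 0 enables raster interrupts  */
#define BG_COLOR_1  (*(volatile uint8_t *)0xD022) /* one of the "fixed" shared colors */

void setup(void) {
    RASTER_LINE = 100;   /* fire when the beam finishes painting line 100 */
    IRQ_ENABLE |= 0x01;  /* turn on the raster interrupt source           */
}

/* Called when the interrupt fires: everything above line 100 has already
   been painted with the old shared color; everything below gets the new one. */
void raster_handler(void) {
    BG_COLOR_1 = 0x06;   /* swap one "global" color in mid-frame         */
    IRQ_STATUS = 0x01;   /* acknowledge, so the interrupt can fire again */
}
```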

But let’s say we don’t want to deal with trying to create a picture using tiles. The Commodore 64 actually does also offer a bitmap mode of sorts, albeit one with restrictions of its own that allow it to reduce the memory footprint from an untenable 32 K to a more reasonable if still painful 9 K. Here an 8 K chunk of memory is allocated to the bitmap, with each bit representing the status (on or off) of a single pixel. The foreground color represented by an “on” pixel is once again determined by a 1 K color table, with the colors still sorted into 8 X 8 pixel blocks. This leads to the most obvious oddity of the 64’s bitmap mode: the bitmap does not run all the way across the screen and then down, but rather across and down through each 8 X 8 cell that is assigned a given foreground color.

Bitmap mode on the Commodore 64
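In practice the oddity boils down to a slightly strange address calculation. A sketch of locating pixel (x, y) within the 8 K bitmap (the array name is hypothetical):

```c
#include <stdint.h>

extern uint8_t bitmap[8000]; /* the 8 K bitmap area */

int pixel_set(int x, int y) {
    int offset = (y / 8) * 320  /* skip whole rows of cells: 40 cells x 8 bytes */
               + (x / 8) * 8    /* skip whole cells within this row of cells    */
               + (y % 8);       /* the byte for this scan line inside the cell  */
    return (bitmap[offset] >> (7 - (x % 8))) & 1;
}
```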

For those willing to trade resolution for colors, there is also a multicolor bitmap mode, which, like the multicolor character mode, treats each two bits as representing a single pixel of one of four possible colors. Horizontal resolution is accordingly reduced to 160 pixels. This mode is, however, more flexible than multicolor character mode in its choice of colors. Another area of memory, of 1 K, is allocated to a collection of color pairs for each cell, each pair packed into a single byte. Thus we can freely choose three of the four colors found within each cell without resorting to raster interrupts or other tricks. Total memory devoted to the display in multicolor bitmap mode amounts to 10 K.
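A sketch of where a multicolor-bitmap pixel’s color actually comes from, with hypothetical array names; the fourth source, the machine’s color RAM, is what supplies the third freely chosen color per cell:

```c
#include <stdint.h>

extern uint8_t bitmap[8000];
extern uint8_t color_pairs[1000]; /* two colors packed per byte, one pair per cell */
extern uint8_t color_ram[1000];   /* the third freely chosen color for each cell   */
extern uint8_t background;        /* one color shared by the whole screen          */

uint8_t mc_bitmap_color(int x, int y) { /* x runs 0..159 */
    int cell   = (y / 8) * 40 + (x / 4);
    int offset = (y / 8) * 320 + (x / 4) * 8 + (y % 8);
    int pair   = (bitmap[offset] >> (6 - 2 * (x % 4))) & 3;
    switch (pair) {
        case 0:  return background;               /* bit pair 00            */
        case 1:  return color_pairs[cell] >> 4;   /* upper half of the byte */
        case 2:  return color_pairs[cell] & 0x0F; /* lower half of the byte */
        default: return color_ram[cell];          /* bit pair 11            */
    }
}
```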

Ten kilobytes may not sound like much at first glance, but for a programmer trying to shoehorn a complex game into 64 K it’s quite a sacrifice indeed. For this reason, and because its other restrictions could make it almost as challenging to work with as character mode, bitmap mode is not used as often as you might expect in Commodore 64 games. Summer Games is, however, a partial exception, employing bitmap mode in quite a number of places. For instance, Stephen Landrum’s opening-ceremonies sequence uses a multicolor bitmap. This sequence also demonstrates another critical part of the 64’s display hardware: sprites.


Doing animation by changing the contents of screen memory is very taxing on a little 8-bit CPU like the 64’s 6502, not to mention tricky to time so that changes are not made in the middle of screen paints, which would result in ugly jerking and tearing effects. Sprites come to the rescue. Indeed, their presence or absence is a good indication of whether a given machine from this era is pretty good at playing graphically intense games (the 64, the Atari 8-bit lines) or not (the Apple II, the IBM PC). A sprite is a relatively small graphical element which is overlaid onto the physical screen, but independent of the bitmap or tile map stored in memory. It can be moved about quickly at minimal cost, just by changing a couple of registers. The display circuitry does the rest.

The 64 offers eight sprites to the programmer, each exactly 24 pixels wide by 21 tall. The image for each is stored in memory as the usual grid of on/off bits, for the modest total of 64 bytes used per sprite. An on bit represents the sprite’s color, of which each has exactly one; an off bit represents transparency, so that whatever is on the screen behind shows through. This means that the 24 X 21 pixel size is not so arbitrary as it may first appear; a smaller sprite can be displayed simply by turning off the unneeded pixels.

There is also the inevitable multicolor sprite, which gives us three foreground colors to work with at the expense of half of our horizontal resolution. In this mode, the sprite is effectively just 12 X 21 pixels, but each pixel is now twice as wide as before, resulting in the same physical width on the screen. As in multicolor character mode, the second and third colors are fixed across all sprites in this mode.

A sprite can be pointed to different addresses in memory for its image between screen paints, creating the possibility of making animated sprites which cycle through a sequence of frames, page-flip style. Likewise, single- and multicolor sprites can be placed together and moved in lockstep to create larger or more complex onscreen figures. In the sequence above, the runner is made from three single-color sprites, each of which cycles through 14 frames of animation. (If you’ve played Impossible Mission, he may look familiar to you: he is in fact the same sprite as your avatar in that game, which Dennis Caswell happily shared with his colleagues.) The flames are four multicolor sprites, each with four frames of animation. And each of the eight doves is a single single-color sprite of eight animation frames.
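Flipping frames this way is cheap because only the sprite’s data pointer changes between paints. On a stock 64 the screen sits at $0400, and the eight sprite pointers occupy its last eight bytes; each holds the number of the 64-byte block where that sprite’s image lives. A sketch, with the frames-stored-back-to-back layout being an assumption of this example:

```c
#include <stdint.h>

/* The eight sprite data pointers at $07F8-$07FF on a stock machine. */
#define SPRITE_PTRS ((volatile uint8_t *)0x07F8)

/* Advance one sprite to its next animation frame, wrapping around.
   Frames are assumed stored in consecutive 64-byte blocks. */
void next_frame(int sprite, uint8_t first_block, uint8_t frame_count) {
    uint8_t step = (uint8_t)((SPRITE_PTRS[sprite] - first_block + 1) % frame_count);
    SPRITE_PTRS[sprite] = first_block + step; /* takes effect on the next paint */
}
```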

But… again, wait. That’s far more than eight sprites in total, isn’t it? As you may have guessed, Landrum uses raster interrupts to reconfigure and thus reuse sprites as each screen paint proceeds. With the addition of such tricks the 64’s effective limit becomes not eight sprites in total but no more than eight sprites horizontally parallel with one another.

Let’s take another example, this time one showing an actual, interactive event in action: Stephen Landrum’s pole vault. I have my usual mediocre performance in the clip that follows, but my wife Dorte kicks some ass and actually demolishes our old world record.


The screen you see here is another multicolor bitmap. The vaulter is made up of three single-color sprites, which cycle through seven frames of animation as he runs and are then changed appropriately to reflect his state after he goes airborne. The pole is three single-colored sprites and the crossbar is a single multicolored sprite, as is, surprisingly and cleverly, the stationary top of the nearer (right-hand) upright. To understand this last, we have to understand the 64’s concept of sprite priority. Sprites are numbered from 0 to 7. If two sprites overlap one another, the sprite with the lower number is drawn on top of the one with the higher number. Landrum uses this property to easily create the illusion of the jumper passing behind the nearer upright as he soars through the air.

You might have noticed that the pole, the crossbar, and the upright are all quite large. This is down to yet another feature of the 64’s sprite system. It’s possible to expand a sprite vertically or horizontally or both, doubling its size (but not its resolution).
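The expansion flags live in two real VIC-II registers, one bit per sprite, as in this quick sketch (C again standing in for assembly):

```c
#include <stdint.h>

#define SPR_EXPAND_X (*(volatile uint8_t *)0xD01D) /* horizontal doubling, one bit per sprite */
#define SPR_EXPAND_Y (*(volatile uint8_t *)0xD017) /* vertical doubling, one bit per sprite   */

void expand_sprite(int n) {  /* double sprite n in both directions */
    SPR_EXPAND_X |= (uint8_t)(1 << n);
    SPR_EXPAND_Y |= (uint8_t)(1 << n);
}
```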

The pole vault is not quite as polished as most of the events, which may be a sign that, as one of the later events completed, it was a bit rushed. There’s some odd artifacting in the pole, for instance. And there’s a wonderful bug that lets you vault under the crossbar on its highest setting, creating a world record for the ages.


The two swimming events, which were started by Randy Glover but finished by Landrum following the former’s abrupt resignation, are the most complex in Summer Games. They’re largely an exercise in rhythm; you have to press the joystick button as your swimmer’s arms enter the water, then release it when they emerge. I’m awful at it, but Dorte is pretty good.


The clock at top right is formed from six single-color sprites, each swimmer from four. The rest of what you see here may begin to illustrate how crazy you can get with raster interrupts. Each paint begins with the 64 in single-color bitmap mode. This allows the text (“Ready… Set… Go!”), which is drawn and erased directly into the bitmap, to be rendered in the higher resolution. But then, just as the electron gun reaches the top of the stands, the screen is changed to a multicolor bitmap.

Glover and Landrum use a technique known as double buffering to make the scrolling as smooth as possible. There are actually two bitmaps in memory, one of which is always being displayed and the other of which is being updated by the CPU for the next step in the scroll. When the time comes, the two are swapped, as the 64’s VIC-II graphics chip is pointed to the other in the pair. Well, it’s almost that simple. Complications arise because the poor 6502 just doesn’t have time to completely redraw a screen in memory for every pixel of scroll. Luckily, it doesn’t have to. The VIC-II also has what are known as horizontal and vertical fine-scrolling registers. They allow the programmer to shift the bitmap that appears onscreen by 1 to 7 pixels to the right (as in the swimming events) or down. Since this will create an ugly empty zone at the edges of the display for which the computer has no pixel data to display, another register lets the programmer expand the size of the border slightly to cover these cells — the width of the screen is reduced from 40 to 38 columns, or the height from 25 to 24 lines. Now it’s possible for Glover and Landrum to scroll the screen eight pixels before having to swap to the alternate bitmap, giving the CPU time to prepare said bitmap. Double buffering is rather unusual to find on the 64, as it’s horrendously expensive in memory. And indeed, the swimming events use virtually every last byte.
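One scroll step under such a scheme might look like the sketch below, C once more standing in for 6502 assembly. The fine-scroll bits of register $D016 and the bitmap-select bit of $D018 are real VIC-II features; the bookkeeping around them is simplified here:

```c
#include <stdint.h>

#define CONTROL2  (*(volatile uint8_t *)0xD016) /* low 3 bits: horizontal fine scroll   */
#define MEM_SETUP (*(volatile uint8_t *)0xD018) /* bit 3 selects which 8 K bitmap shows */

static uint8_t fine_x = 0; /* current fine-scroll position, 0..7 */

void scroll_one_pixel(void) {
    if (fine_x > 0) {
        fine_x--;           /* the cheap case: just nudge the register         */
    } else {
        fine_x = 7;         /* wrapped: the CPU has had 8 frames to prepare    */
        MEM_SETUP ^= 0x08;  /* flip to the other bitmap, redrawn 8 pixels over;
                               the CPU now starts redrawing the hidden one     */
    }
    CONTROL2 = (CONTROL2 & 0xF8) | fine_x;
}
```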

But that’s probably enough tech talk for today. Just for fun — and because if you got through all that you’ve earned it — let’s look at the other events in somewhat less exhaustive (exhausting?) detail.

The two running events have their origin in Starpath’s old Supercharger decathlon project, but were brought to the 64 and completed by Brian McGhie. Like virtually everyone at Epyx, he had no particular knowledge of or burning interest in Olympic sports. He therefore relied on a stack of old Sports Illustrateds to try to get the look of his runners and the stadium right. The events are very similar in appearance, but unlike the swimming events very different in execution. The 100 meter dash is a notorious joystick killer. You have to move the stick back and forth as quickly as possible — nothing more, nothing less. The 4 x 400 meter relay, by contrast, is the most cerebral of the events, a game of energy conservation and chicken. I’m unaccountably good at both, much to Dorte’s frustration.


Interestingly, the scrolling in these events is implemented in an entirely different way from that in the swimming, illustrating how very much Summer Games is really a collection of individual efforts brought together under one banner. McGhie uses a multicolor character screen, and rather than using double buffering updates the hidden border areas on the fly to… but I promised to stop with the tech talk, didn’t I?

The diving event is yet another of Landrum’s. The diver here rather disconcertingly never surfaces after entering the water, simply because Landrum ran out of time.


Skeet shooting was a joint project of John Leupp, Steve Mudry, and Randy Glover prior to his departure. They originally planned to show the shooter on the screen, as in all of the other events, but found it difficult to work out a practical way of implementing the event from that perspective. So skeet shooting received the only first-person perspective in Summer Games, and the poor shooter was left out entirely.


Finally, there’s the gymnastics event — really just a vault — by Mudry. In an example of the, shall we say, casual approach to box art that was so rife in this era, the Summer Games box shows someone doing a handstand.


If nothing else, this article has hopefully conveyed what a tricksy machine the Commodore 64 is, full of hidden capabilities and exploitable quirks. Learning to make it dance for you requires considerable time even if you have examples to follow. If you don’t… well, small wonder that its games were just beginning to come into their own in 1984, the year it had its second birthday. And Epyx and companies like it were barely scratching the surface. In a couple of years Summer Games would look downright quaint.

You can download the original Commodore 64 Summer Games and its manual from here if you like, for use in the emulator of your choice (I recommend VICE). Unlike most of the disk images floating around the Internet, this one is pristine, with the original set of world records, so you and your friends and/or family can make your own records — which is about 20% of the fun of playing Summer Games — rather than be shamed by the performances of obsessed teenagers from two or three decades ago.

We’ll continue to observe the Commodore 64 scene with interest in future articles. But next we’ll check in with a group of Atari 8-bit loyalists: the backwoods savants of Ozark Softscape.

(This article draws again from the Epyx retrospectives in the July 1988 and August 1989 issues of Commodore Magazine. Technical details of Summer Games were drawn from the Commodore 64 design case study which appeared in the March 1985 IEEE Spectrum. I also lifted the diagram showing the 64’s unusual bitmap mode from there. For what it’s worth, my favorite 64 technical reference is Mapping the Commodore 64 by Sheldon Leemon. And if I may be forgiven a blatant plug, do check out my book on the Amiga if you’re interested in the sort of technical details I’ve delved into in this post. Some of what I go into in the book actually applies equally to the 64, and I explain basic concepts, starting with what a bit and byte actually are, much more fully there.)

 
 


A Computer for Every Home?

On January 13, 1984, Commodore held their first board of directors meeting of the year. It should have been a relaxed, happy occasion, a time to make plans for the new year but also one last chance to reflect on a stellar 1983, a year in which they had sold more computers than any two of their rivals combined and truly joined the big boys of corporate America by reaching a billion dollars in gross sales. During the last quarter of 1983 alone they had ridden a spectacular Christmas buying season to more than $50 million in profits. Commodore had won the Home Computer Wars convincingly, driving rival Texas Instruments to unconditional surrender. To make the triumph even sweeter, rival Apple had publicly announced the goal of selling a billion dollars worth of their own computers that year, only to fall just short thanks to the failure of the Lisa. Atari, meanwhile, had imploded in the wake of the videogame crash, losing more than $500 million and laying off more than 2000 workers. Commodore had just the previous summer moved into a sprawling new 585,000 square-foot, two-story headquarters in West Chester, Pennsylvania that befitted their new stature; some of the manufacturing spaces and warehouses in the place were so large that Commodore veterans insist today that they had their own weather. Yes, it should have been a happy time at Commodore. But instead there was doubt and trepidation in the air as executives filed into the boardroom on that Friday the 13th.

A day or two before, Jack Tramiel had had a heated argument with Irving Gould, Commodore’s largest shareholder and the man who controlled his purse strings, in the company’s private suite above their exhibit at the 1984 Winter Consumer Electronics Show. That in itself wasn’t unusual; these two corrupt old bulldogs had had an adversarial relationship for almost two decades now. This time, however, observers remarked that Gould was shouting as much as Tramiel. That was unusual; Gould normally sat impassively until Tramiel exhausted himself, then quietly told him which demands he was and wasn’t willing to meet. When Tramiel stormed red-faced out of the meeting and sped away in the new sports car he’d just gotten for his 55th birthday, it was clear that this was not just the usual squabbling. Now observers outside the board-of-directors meeting, which was being chaired as usual by Gould, saw Tramiel depart halfway through in a similar huff. He would never darken Commodore’s doors again.

No one who was inside that boardroom has ever revealed exactly what transpired there. With Gould and Tramiel both now dead and the other former board members either dead or aged, it’s unlikely that anyone ever will. On the face of it, it seems hard to imagine. What could cause these two men who had managed to stay together through the toughest of times, during which Commodore had more than once teetered on the edge of bankruptcy, to irrevocably split now, when their company had just enjoyed the best year in its history? We can only speculate.

Commodore had ceased truly being Tramiel’s company in 1966, when Gould swooped in to bail him out from the Atlantic Acceptance scandal of the previous year. Tramiel, however, never quite got the memo. He continued to run the company like a sole proprietor to whatever extent Gould would let him. Tramiel micro-managed to an astonishing degree. He did not, for instance, believe in budgets, considering them a “license to steal,” a guarantee that the responsible manager, knowing he had X million available, would always spend at least X million. Instead he demanded that every expenditure of greater than $1000 be approved personally by him, with the result that much of the company ground to a halt any time he took a holiday. Even as Tramiel enjoyed his best year ever in business, Gould and others in the financial community were beginning to ask the very reasonable question of whether this was really a sustainable way to run a billion-dollar company.

Still, the specific cause of Tramiel’s departure seems likely to have involved his sons. Tramiel valued family above all else, and, like a typical small businessman, dreamed of leaving “his” company to his three sons. Whether by coincidence or something else, it even worked out that each son had an area of expertise that would be critical to running a company like Commodore. Sam, the eldest, had trained in business management at York University, while Gary, the youngest, was a financial analyst with a degree from Manlow Park College and experience as a stockbroker at Merrill Lynch. Leonard, the middle child, was the intellectual and the gearhead; he was finishing a PhD in astrophysics at Columbia, and was by all accounts quite an accomplished hardware and software hacker. Sam and Gary already worked for Commodore, while Leonard planned to start as soon as he finished his PhD in a few more months. Various witnesses have claimed that Tramiel the elder now wished to begin more actively grooming this three-headed monster to take more and more of his responsibilities, and someday to take his place. Feeling nothing good could come out of such blatant nepotism inside a publicly traded corporation that was trying to put its somewhat seedy history behind it, Gould refused absolutely to countenance such a plan. Given Tramiel’s devotion to his family and his attitude toward Commodore as his personal fiefdom, it does make a degree of sense that this particular rejection might have been more than he could stomach.

In any case, Tramiel was gone, and Gould, who had made his fortune in the unglamorous world of warehousing and shipping and was reportedly both a bit jealous of Tramiel’s high profile in an exciting, emerging industry and a bit embarrassed by his gruff, untutored ways, didn’t seem particularly distraught about it. The man he brought in to replace him could hardly have been more different. Marshall F. Smith was a blandly feckless veteran of boardrooms and country clubs who had spent his career in the steel industry. It’s hard to grasp just why Gould latched onto Smith of all people. Perhaps he was following the lead of Apple, who the previous year had brought in their own leader from outside the computer industry, John Sculley. Sculley, however, understood consumer marketing, having cut his teeth at Pepsi, where he was the mastermind behind the Pepsi Challenge, still one of the most iconic and effective advertising campaigns in the long history of the Cola Wars. The anonymous world of Big Steel offered no comparable experience. Smith’s appointment was the first of a long string of well-nigh incomprehensible mistakes Gould would make over the next decade. Engineers who were initially thrilled to have proper funding and actual budgets at last were soon watching with growing concern as Smith puttered about with a growing management bureaucracy and let the company drift without direction. Many were soon muttering that it’s often better to make a decision — even the wrong decision — than to just let things hang. Whatever else you could say about Jack Tramiel, he never lacked the courage of his convictions.

Commodore’s first significant new models, which reached stores at last in October of 1984, more than two years after the Commodore 64, hardly did much to inspire confidence in the new regime. Nothing about the Commodore 16 and the Plus/4 made any sense at all. The 16 was an ultra-low-end model with just 16 K of memory, long after the time for such a beast had passed. The trend in even inexpensive 8-bit computers was onward, toward the next magic number of 128 K, not backward to the late 1970s.

The Commodore Plus/4


As for the Plus/4, which like the 64 was built around a variant of the 6502 CPU and had the same 64 K of memory but was nevertheless incompatible… well, it was the proverbial riddle wrapped in a mystery inside an enigma. It was billed as a more “serious” machine than the 64, a computer for “home and business applications” rather than gaming, and priced to match at about $300, more than $100 more than the 64. It featured four applications built right into its ROM (thus the machine’s name): a file manager, a word processor, a spreadsheet, and a graphing program. All were pathetically sub-par even by the standards of Commodore 64 applications, hardly the gold standard in business computing. The Plus/4 lacked the 64’s sprites and SID sound chip, which made a degree of sense; for a dismaying number of years yet a lack of audiovisual capability would be taken as a signifier of serious intent in computing. But why did it offer more colors, 121 as opposed to the 64’s 16? And as an allegedly more serious computer, why didn’t it offer the 80-column display absolutely essential for comfortable word processing and other typical productive tasks? And as a more serious (and expensive) computer, why did it have a rubbery keyboard almost as awful to type on as the IBM PCjr’s Chiclet model? And would all those serious, more productive buyers really be doing a lot of BASIC programming? If not, why was one of the main selling points a much better BASIC than the bare-bones edition found in the 64? Info, a magazine that would soon build a reputation for saying the things about Commodore’s bizarre decisions that nobody else would, gave the Plus/4 a withering review:

The biggest problem with the Plus/4 is the fundamental concept: an 8-bit, 64 K, 40-column desktop personal computer. Commodore already makes the best 8-bit, 64 K, 40-column desktop personal computer you can buy, with literally thousands of products supporting it! Why should consumers want a “new” machine with no significant advances, several new limitations, and virtually no third-party product support? And why would a company with no competition in the under-$500 category bring out an incompatible [machine] that can’t compete with anybody’s machine except their own? It just doesn’t compute!

Info ran a wonderfully snarky contest in the same issue, giving away the Plus/4 they’d just reviewed. After all, it was “sure to become a collector’s item!” Even the more staid Compute!’s Gazette managed to flummox a poor Commodore representative with a single question: “Why buy a 264 [a pre-release name for the Plus/4] instead of a 64 that has a word processor and, say, a Simon’s BASIC? It would be the equivalent of the 264 for less money.” Commodore happily claimed that the Plus/4 had enough utility built right in for the “average small business” (maybe they meant one of the vast majority that fail within a year or two anyway), but in reality it seemed like it had been cobbled together from spare parts that Commodore happened to have lying around. In fact, that’s not far from what happened — and Tramiel actually bears as much responsibility for the whole fiasco as the clueless Marshall Smith.

Tramiel, you’ll remember, had driven away the heart of his engineering team in his usual hail of recriminations and lawsuits shortly after they had created the 64 for him. He did eventually find more talented young engineers, notably Bil Herd and Dave Haynie. (Commodore always preferred their engineers young and inexperienced because that way they didn’t have to pay them much — a strategy that sometimes backfired but was sometimes perversely successful, netting them brilliant, unconventional minds who would have been overlooked by other companies.) When Herd arrived at Commodore in early 1983, engineers had been tinkering for some time with a new video and audio chip, the TED (short for Text Editing Device). With engineering straitened as ever by Tramiel’s aversion to spending money, the 23-year-old Herd soon found himself leading a project to make the TED the heart of a new computer, despite the fact that it was in some ways a step back, lacking the sprites of the 64’s VIC chip and the marvelous sound capabilities of its SID chip. Marketing came up with the dubious idea of including applications in ROM, which by all accounts delighted Tramiel.

Tramiel, who at some fundamental level still thought of the computers he now sold like the calculators he once had, failed to grasp that the whole value of a computer is the ability to do lots of different things with it, to have lots and lots of options its designers may never have anticipated, all through the magic of software. Locking applications into ROM, making them impossible to replace or update, was kind of missing the point of building a computer in the first place. Failing to understand that a computer is only as good to consumers as the quality and variety of its available software, Tramiel also saw no problem with making the new machine incompatible with the 64. It seems to have come as a complete surprise to him when the machine was announced at that fateful Winter CES and everyone’s first question was whether they could use it to run the Commodore 64 software they already had.

After Tramiel’s abrupt departure, Commodore pushed ahead with the 16 and Plus/4 in the muddled way that would be their wont for the rest of the company’s life, despite a skeptical press and utterly indifferent consumers. It all made so little sense that some have darkly hinted at a conspiracy hatched by Tramiel amongst his remaining loyalists at Commodore to get the company to waste resources, time, and credibility on these obvious losers. (Tramiel recruited a substantial number of said loyalists to join him after he purchased Atari and got back in the home-computer game — exactly the sort of thing for which he so often sued others. But that’s a story for a later article.) Incredibly, given the cobbled-together nature of the machine, it took nine more months after that CES to finally get the 16 and Plus/4 into production and watch them duly flop. Again, such a glacial pace would prove to be a consistent trait of the post-Tramiel Commodore.

By the time they did appear at last, the poor, benighted 16 and Plus/4 had more working against them than just their own failings, considerable as those may have been. The year as a whole was marked by failures in the home-computer segment of the market. Atari was reeling. Coleco was taking massive losses on their tardy entry into the home-computing field, the Adam. And of course I’ve already told you about the IBM PCjr.

Even Apple, who had enjoyed a splashy, successful launch of their new higher-end Macintosh (another story for a later date), had a somewhat disappointing new model amongst their bread-and-butter Apple II line. The “c” in the Apple IIc’s name stood for “compact,” and it was indeed a much smaller version of Steve Wozniak’s old evergreen design. Like the Macintosh, it was a closed system designed for the end user who just wanted to get work (or play) done, not for the hackers who had adored the earlier editions of the II with their big cases and heaps of inviting expansion slots. The idea was that you would get everything you, the ordinary user, really needed built right in: all of the fundamental interface cards, a disk drive, a full 128 K of memory (as much as the Macintosh), etc. All you would really need to add to have a nice home-office setup was a monitor and a printer.

The Apple IIc


But the IIc was not envisioned just as a more practical machine: as the only II model after the first in whose creation Steve Jobs played an important role, it evinced all of his famous obsession with design. Indeed, much of the external look and sensibility that we associate with Apple today begins as much here as with the just slightly older — and, truth be told, just slightly clunkier-looking — first Macintosh model. The Apple IIc was the first product of what would turn into a longstanding partnership with the German firm Frog Design. It marks the debut of what Apple referred to as the “Snow White” design language — slim, modern, sleek, and, yes, white. Everything about the IIc, including the packaging and the glossy manuals inside, oozed the same chic elegance.

Apple introduced the IIc at a lavish party and exhibition in San Francisco’s Moscone Center in April of 1984, just three months after a similar shindig to launch the Macintosh. The event was dubbed “Apple II Forever,” a name chosen to mollify restless Apple II owners who feared — rightly so, as it would turn out; even at “Apple II Forever” Jobs made time for a presentation on “The First 100 Days of Macintosh” — that Sculley, Jobs, and their associates had little further interest in them. Geniuses that they have always been for burnishing their own myths, Apple built a museum right there in the conference center, its centerpiece a replica of the garage where it had all begun. The IIc unveiling itself was an audiovisual extravaganza featuring three huge projection screens for the music video Apple had commissioned for the occasion. The most dramatic and theatrical moment came when Sculley held the tiny machine above him onstage for the first time. As the crowd strained to see, he asked if they’d like a closer look. Then the house lights suddenly came up and every fifth person in the audience stood up with an Apple IIc of her own to show and pass around.

Apple confidently predicted that they would soon be selling 100,000 IIcs every month on the strength of the launch buzz and a $15 million advertising campaign. In actuality the machine averaged just 100,000 sales per year over its four years in Apple’s product catalogs. The old, ugly IIe outsold its fairer sibling handily. This left Apple in a huge bind for a while, for they had all but stopped production of the IIe in anticipation of the IIc’s success while wildly overproducing IIcs for a rush that never materialized. Thus for some time stores were glutted with the IIcs that consumers didn’t want, and couldn’t get their hands on the IIes that they did. (It’s interesting to consider that the PCjr, an acknowledged flop, almost certainly sold more units during each machine’s first year on the market than did the IIc, which has never been tarred with that label. Narratives can be funny things.)

It remains even today somewhat unclear why the world never embraced the IIc as it had the three Apple II models that preceded it. There’s some evidence to suggest that consumers, not yet conditioned to expect each new generation of computing technology to be both smaller and more powerful than the previous, took the IIc’s small size to be a sign that it was not as serious or powerful as the IIe. Apple was actually aware of this danger before the IIc debuted. Thus the advertising campaign worked hard to explain that the IIc was more powerful than its size would imply, with the tagline, “Announcing a technological breakthrough of incredible proportions.” Yet it’s doubtful whether this message really got through. In addition, the IIc was, like the PCjr, an expensive proposition for the home-computer buyer: almost $1300, $300 more than a basic IIe. For that price you got twice the memory of the IIe as well as various other IIe add-on options built right in, but the value of all this may have been difficult for the novice buyer, the IIc’s main target, to grasp. She may just have seen that she was being asked to pay more for a smaller and thus presumably less capable machine, and gone with the bigger, more serious-looking IIe (if anything from Apple).

Then again, maybe the IIc was just born under a bad sign. As I’ve already noted, nobody was having much luck with their new home computers in 1984, almost regardless of their individual strengths and weaknesses.

But why was this trend so universal? That’s what people inside the industry and computer evangelists outside it were asking themselves with increasing urgency as the year wore on. As 1984 drew toward a close, the inertia began to affect even the most established warhorses, the Commodore 64 and the Apple IIe. Both Commodore and Apple posted disappointing Christmas numbers, down at least 20% from the year before, and poor Commodore, now effectively a one-product company completely reliant on continuing sales of the 64, sank back well below that magic billion-dollar threshold again. In the grand scheme of things the Commodore 64 was still a ridiculously successful machine, by far the bestselling computer in the world and the preeminent gaming platform of its era. Yet there increasingly seemed to be something wrong with the home-computer revolution as a whole.

Commodore 64 startup screen

The fact was that a backlash had been steadily building almost from the moment that the spectacular Christmas 1983 buying season had ended. Consumers had begun to say, and not without considerable justification, that home computers promised far more than they delivered. Watching all those bright, happy faces in television and print advertising, people had bought computers expecting them to do the things that the computers there were doing. As Commodore’s advertising put it, “If you’re not pleased with what’s on your TV set tonight, simply turn on your Commodore 64.” Yet what did you get when you turned on your 64 — after you figured out how to connect it to your TV in the first place, that is? No bright fun, just something about 38,911 somethings, a READY prompt, and a cryptically blinking cursor. Everything about television was easy; everything about computers was hard. Computers had been sold to consumers like any other piece of consumer electronics, but they were not like any other piece of consumer electronics. For the vast majority of people — those who had no intrinsic fascination with the technology itself, who merely wanted to do the sorts of things those families on TV were doing — they were stubborn, frustrating, well-nigh intractable things. Ordinary consumers were dutifully buying computers, but computers were at some fundamental level not yet ready for ordinary consumers.

The computer industry was still unable to really answer the question which had dogged and thwarted it ever since Radio Shack had run the first ads showing a happy housewife sorting her recipes on a TRS-80 perched on the kitchen table: why do I, the ordinary man or woman with children to feed and a job to keep, need one? Commodore had cemented the industry’s go-to rhetoric with the help of William Shatner in their VIC-20 advertising campaign that first carved out a real market segment for home computers. You needed a computer for productivity tasks and for your children’s future, “Johnny can’t read BASIC” having replaced “Johnny can’t read” as the marker of a neglectful parent. Entertainment was relegated to an asterisk at the end: “Plays great games too!”

Yet, honestly, how productive could you really be with even the Commodore 64, much less the 5 K VIC-20? Some people did manage to do productive things with their 64s, but most of those who did forgot or decided not to ask themselves a simple question: is doing this on the computer really easier than the alternative? The answer was almost always no. Hobbyists chose to do things on the computer because it was cool, not because it was practical. Never mind if it took far more effort to keep one’s address book on the Commodore 64, what with its slow disk drive and quirky, unrefined software, than it would have to just have a paper card file. Never mind if it was much riskier as well, prone to deletion by an errant key swipe or a misbehaving disk drive. It was cooler, and that was all that mattered — to a technology buff. Most other people found it easier to address their Christmas cards by hand than to try to feed envelopes through a tractor-fed dot-matrix printer that made enough noise to wake the neighbors.

Perhaps the one genuinely compelling productive use of a machine like the Commodore 64 in the home was as a word processor. Kids today can’t imagine how students once despaired when their teachers told them that a report had to be typed back in the era of typewriters, can’t conceive how difficult it was to get anything on paper in typewritten form when every mistake made by untutored fingers meant trying to decide between pulling out the Liquid Paper or just starting all over again. But even word processing on the 64 was made so painful by the 40-column screen and manifold other compromises that there was room to debate whether the cure was worse than the disease. Specialized hardware-based word processors became hugely popular during this era for just this reason. These single-function, all-in-one devices were much more pleasant to use than a Commodore 64 equipped with a $30 program, and cheaper than buying a whole computer system, especially if you went with a higher-priced and thus more productively useful model like the Apple II.

The idea that every child in America needed to learn to program, lest she be left behind to flip burgers while her friends had brilliant careers, was also absurd on the face of it. It was akin to declaring during the days of the Model T that every citizen needed to learn to strip down and rebuild one of these newfangled automobiles. Basic computer literacy was important (and remains so today); BASIC literacy was not. What a child really needed to know could largely be taught in school. Parents needn’t have fretted if Junior preferred reading, listening to music, playing sports, or practicing origami to learning the vagaries of PEEKs and POKEs in BASIC 2.0. There would be time enough for computing when computing and Junior had both grown up a bit.

So, everything had changed yet nothing had changed since the halcyon days of the trinity of 1977. Computers were transforming the face and in some cases the very nature of business, yet there remained just two compelling reasons to have one in the home: 1) for the sheer joy of hacking or 2) for playing games. Lots more computers were now being used for the latter than the former, thanks to the vastly more and vastly better games that were now available. But for many folks games just weren’t a compelling enough reason to own one. The Puritan ethic that makes people feel guilty about their pleasures was as strong in America then as it remains today. It certainly didn’t help that the media had been filled for several years now with hand-wringing about the effect videogames were having on the psyches of youngsters. (This prompted many computer-game publishers of this period to work hard, albeit likely with limited success, to label their computer games as something different, something more cerebral and rewarding and even, dare we say it, educational than their simplistic videogame cousins.)

But, perhaps most of all, computers still remained quite expensive when you really dug into everything you needed for a workable system. Yes, you could get a Commodore 64 for less than $200 by the Christmas of 1983. But then you needed a disk drive ($220) if you wanted to do, well, much of anything with it; a monitor ($220) if you wanted a nice picture and didn’t want to tie up the family television all the time; a printer ($290) for word processing, if you wanted to take that fraught plunge; a modem ($60) to go online. It didn’t take long until you were approaching four digits, and that’s without even entering into a discussion of software. There was thus a certain note of false advertising in that sub-$200 Commodore 64. And because these machines were being sold through mass merchandisers rather than dealers, there was no one who really knew better, who could help buyers to put a proper system together at the point of sale. Consumers, conditioned by pretty much every other product that was sold to them to expect something that worked on its own right out of the box, were often baffled and frustrated when they realized they had bought an expensive doorstop. Many of the computers sold during that Christmas of 1983 were turned on a few times only, then consigned to the back of the closet or attic to gather dust. The bad taste they put in many people’s mouths would take years to go away. Meanwhile the more complete, useful machines, like the Apple IIc and the PCjr, were still more expensive than a complete Commodore 64 system — and the games on them weren’t as good to boot. Hackers and passionate gamers (or, perhaps more commonly, their generous parents) were willing to pay the price. Curious novices largely were not. Faced with no really good all-purpose options, many — most, actually — soon decided home computers just weren’t worth it. The real home-computer revolution, as it turned out, was still almost ten years away. About 15% of American homes had computers — at least ostensibly; many of them were, as just mentioned, buried in closets — by January 1, 1985, but that figure would rise with agonizing slowness for the rest of the decade. People could still live perfectly happy lives fully plugged into the cultural discourse around them and raise healthy, productive children in the process without owning a computer. Only much later, with the arrival of the World Wide Web and computers equipped with more intuitive graphical user interfaces for accessing it, would that change.

Which is not to say that the software and information industries that had exploded in and around the home-computer revolution during 1982 and 1983 died just like that. Many of its prominent members, however, did, as the financial gambles they had taken in anticipation of the home-computer revolution came back to haunt them. We’ve just seen how Sierra nearly went under during this period. Muse Software and Scott Adams’s Adventure International, to name two other old friends from this blog, weren’t so lucky; both folded in 1985. Electronic Arts survived, but steered their rhetoric and choice of titles somewhat away from Trip Hawkins’s original vision of “consumer software” and toward the hardcore, concentrating on proven genres like the CRPG and the adventure game.

Magazines were even harder hit. By early 1984 there were more than 300 professionally published computing periodicals of one sort or another, many of them just founded during the boom of the previous year. Well over half of these died during 1984 and 1985. Mixed in with the dead Johnny-come-latelys were some cherished veteran voices, among them pioneers Creative Computing (1974), SoftSide (1978), and Softalk (1980). Softalk’s demise, after exactly four years and 48 issues of sometimes superb people-focused journalism, came as a particular blow to the Apple II community; Apple historian Steven Weyhrich names this moment as nothing less than the end of the “golden age” of the Apple II. Those magazines that survived often did so in dramatically shrunken form. Compute!, for instance, went from 392 pages in December of 1983 to 160 ten months later.

Yet it wasn’t all doom and gloom. Paradoxically, some software publishers still did quite well. Infocom, for example, had the best single year in their history in 1984 in terms of unit sales, selling almost 750,000 games. It seemed that, with more options than ever before, software buyers were becoming much more discerning. Those publishers like Infocom who could offer them fresh, quality products showing a distinctive sensibility could do very well. Those who could not, like Adventure International with their tired old two-word parsers and simplistic engines, suffered the consequences. That real or implied asterisk (“Plays great games too!”) at the end of the advertising copy remained the great guilty secret of the remaining home-computer industry, the real reason computers were in homes at all. Thankfully, the best games were getting ever more complex and compelling; otherwise the industry might have been in even more trouble than it actually was.

Indeed, with a staggering number of machines already out there and heaps still to be sold for years to come, the golden age for Commodore 64 users was just beginning. This year of chaos and uncertainty was the year that the 64 really came into its own as a games machine, as programmers came to understand how to make it sing. Companies that found these keyboard maestros would be able to make millions from them. The home-computer revolution may not have quite panned out as anticipated and the parent company may have looked increasingly clueless, but for gamers the Commodore 64 stood alone with its combination of audiovisual capability, its large and ever growing catalog of games, and its low price. What with game consoles effectively dead in the wake of Atari’s crash and burn, all the action was right here.

In that spirit, we’ll look next time at the strange transformation that led one of our stodgiest old friends from earlier articles to become the hip purveyor of some of the slickest games that would ever grace the 64.

(The indispensable resources on Commodore’s history remain Brian Bagnall’s On the Edge and its revised edition, Commodore: A Company on the Edge. Frank Rose’s West of Eden is the best chronicle I know of this period of Apple’s history. The editorial pages and columnists in Compute! and Compute!’s Gazette provided a great unfolding account of a chaotic year in home computing as it happened. Particular props must go to Fred D’Ignazio for pointing out all of the problems with the standard rhetoric of the home-computer revolution in Compute!‘s May 1984 issue — but he does lose points for naming the PCjr as the answer to all these woes in the next issue.)

 


Business is War

In the 64 Commodore had their potentially world-beating home computer. Now they needed to sell it. Fortunately, Jack Tramiel still had on hand Kit Spencer, the British mastermind behind the PET’s success in his home country and the VIC-20’s in the United States. The pitchman for the latter campaign, William Shatner, was no longer an option to help sell the 64. His contract had run out just as the new machine was released, and his asking price for another go-round had increased beyond what Tramiel was willing to pay in the wake of his hit television series T.J. Hooker and the movie Star Trek II: The Wrath of Khan. Spencer decided to forgo a pitchman entirely in favor of a more direct approach that would hammer on the competition while endlessly repeating those two all-important numbers: 64 K and less than $600. He queued up a major advertising blitz in both print and television for the 1982 Christmas season, the second and final time in their history that Commodore would mount such a concentrated, smart promotional effort.



Effective as it was, the campaign had none of the creativity or easy grace of the best advertising from Apple or IBM. The ads simply regurgitated those two critical numbers over and over in a somewhat numbing fashion, while comparing them with the memory size and price of one or more unfortunate competitors. Surprisingly, there was little mention of the unique graphics and sound capabilities that in the long run would define the Commodore 64 as a platform. It almost seems as if Commodore themselves did not entirely understand the capabilities of the chips that Al Charpentier and Bob Yannes had created for them. Still, Spencer showed no fear of punching above his weight. In addition to the 64’s obvious competitors in the low-end market, he happily went after much more expensive, more business-oriented machines like the Apple II and the IBM PC. Indeed, here those two critical numbers, at least when taken without any further context, favored the 64 even more markedly. The computer industry had never before seen advertising this nakedly aggressive, this determined to name names and call out the competition on their (alleged) failings. It would win Commodore few friends inside the industry. But Tramiel didn’t care; the ads were effective, and that was the important thing.

Commodore took their shots at the likes of Apple and IBM, but the real goal had become ownership of the rapidly emerging low-end — read, “home computer” — market. Tramiel’s competition there were the game consoles and the two other computer makers making a serious mass-market play for the same consumers, Atari and Texas Instruments. For the lower end of the low end, Commodore had the VIC-20; for the higher end, the 64.

Atari 5200


Atari’s big new product for Christmas 1982 was the 5200, a new console based on the same chipset as their computer designs. (Those chips had originally been designed for a successor to the VCS, but were rerouted into full-fledged computers when sales of the current VCS just kept increasing. Thus the process finally came full circle, albeit three years later than expected.) The 5200 was something of a stopgap, a rather panicked product from a company whose management had long since lost interest in engineering innovations. It actually marked Atari’s first major new hardware release in three years. Research and development, you see, had shrunk to virtually nil under the stewardship of CEO Ray Kassar, a former titan of the textile industry who held videogames and his customers in something perilously close to contempt. Despite being based on the same hardware, the 5200 was inexplicably incompatible with cartridges for the existing Atari home computers. Those games that were available at launch were underwhelming, and the 5200 was a major disappointment. Only the VCS — now retroactively renamed the 2600 to account for the new 5200 — continued to sell in good quantities, and those were shrinking steadily. Aside from the 2600 and 5200, Atari had only its two three-year-old computers, the chintzy, little-loved 400 and the impressive but also more expensive 800 with only 48 K of memory. With the latter selling for upwards of $600 and both machines halfheartedly (at best) promoted, the big battles of the conflict that the press would soon dub the “Home Computer Wars” would be fought between TI and Commodore. It would be a disappointing Christmas for Atari, and one which foretold bigger problems soon to come.

Put more personally — and very personal it would quickly become — the Home Computer Wars would be fought between Jack Tramiel and the youthful head of TI’s consumer-products division, William J. Turner. The opening salvo was unleashed shortly before the 64’s introduction by, surprisingly, TI rather than Commodore. At that time the TI-99/4A was selling for about $300, the VIC-20 for $240 to $250. In a move they would eventually come to regret, TI suddenly announced a $100 rebate on the TI-99/4A, bringing the final price of the machine to considerably less than that of the inferior VIC-20. With TI having provided him his Pearl Harbor, Jack Tramiel went to war. On the very same day that Turner had opened hostilities, Tramiel slashed the wholesale price of the VIC-20, bringing the typical retail price down into the neighborhood of $175. Despite this move, consumers chose the TI-99/4A by about a three to one margin that Christmas, obviously judging its superior hardware worth an extra $25 and the delayed gratification of waiting for a rebate check. Some fun advertising featuring Bill Cosby didn’t hurt a bit either, while Commodore’s own contract with William Shatner was now history, leaving little advertising presence for the VIC-20 to complement the big push Spencer was making with the 64. TI sold more than half a million computers in just a few months. Round One: TI.

Of course, the 64 did very well as well, although at almost $600 it sold in nowhere near the quantities it eventually would. In those days, computers were sold through two channels. One was the network of dedicated dealers who had helped to build the industry from the beginning, a group that included chains like Computerland and MicroAge as well as plenty of independent shops. The other, more recent channel was the so-called mass merchandisers — discounters like K-Mart and Toys ‘R’ Us that lived by stacking ’em deep and selling ’em cheap, with none of the knowledge and support to be found at the dealers. Commodore and TI had been the first to begin selling their computers through mass merchandisers. Here Tramiel and Turner shared the same vision, seeing these low-end computers as consumer electronics rather than tools for hobbyists or businessmen — a marked departure from the attitude of, say, Apple. It really wasn’t possible for a computer to be successful in both distribution models. As soon as it was released to the merchandisers, the game was up for the dealers, as customers would happily come to them to get all of their questions answered, then go make the actual purchase at the big, splashy store around the corner. Commodore’s dealers had had a hard time of it for years, suffering through the limited success of the PET line in the American market only to see Commodore pass its first major sales success there, the VIC-20, to the mass merchandisers. They were understandably desperate to have the 64. Cheap as it was for its capabilities, it still represented much more of an investment than the VIC-20. Surely buyers would want to take advantage of the expertise of a real dealer. Tramiel agreed, or at least claimed to. But then, just as the Christmas season was closing, he suddenly started shipping the 64 to the mass merchandisers as well. Dealers wondering what had happened were left with only the parable of the scorpion and the frog for solace. What could Jack say? It was just his nature. By the following spring the street price of a Commodore 64 had dropped below $400, and it could be found on the shelves of every K-Mart and Toys ‘R’ Us in the country.

With the Commodore 64 joining the VIC-20 in the trenches, Christmas 1982 was looking like only the opening skirmish. 1983 was the year when the Home Computer Wars would peak. This was also the year of the Great Videogame Crash, when the market for Atari 2600 hardware and software went into free fall. In one year’s time Atari went from being the darling of Wall Street to a potentially deadly anchor — hemorrhaging millions of dollars and complete with a disgraced CEO under investigation for insider trading — for a Warner Communications that was suddenly desperate to get rid of it before it pulled the whole corporation down. Just as some had been predicting the previous year, home computers moved in to fill some of the vacuum left by the 2600’s sudden collapse.

Atari 1200XL


In a desperate attempt to field an answer to the 64, Atari rushed into production early in 1983 their first new computer since introducing the 400 and 800 more than three years before. Thanks to a bank-switching scheme similar to that of the 64, the Atari 1200XL matched that machine’s 64 K of memory. Unfortunately, it was in almost every other respect a disaster. Atari made the 1200XL a “closed box” design, with none of the expansion possibilities that had made the 800 a favorite of hackers. They used new video hardware that was supposed to be better than the old, but instead yielded a fuzzier display on most monitors and televisions. Worst of all, the changes made to accommodate the extra memory made the new machine incompatible with a whole swathe of software written for the older machines, including many of the games that drove home-computer sales. An apocryphal story has sales of the Atari 800 dramatically increasing in the wake of the 1200XL’s release, as potential buyers who had been sitting on the fence rushed to buy the older machine out of fear it would soon be cancelled and leave them no option but the white elephant that was the 1200XL.

Whatever the truth of such stories, sales for the Atari computer line as a whole continued to lag far behind those of Commodore and TI, and far behind what would be needed to keep Atari a viable concern in this new world order. Huge as Atari (briefly) was, they had no chip-making facilities of their own. Instead, their products were full of MOS chips amongst others. Not only were both their console and computer lines built around the 6502, but MOS manufactured many of the game cartridges for the 2600 and 5200. Thus even when Commodore lost by seeing a potential customer choose an Atari over one of their own machines they still won in the sense that the Atari machine was built using some of their chips — chips for which Atari had to pay them.

Atari would largely be collateral damage in the Home Computer Wars. As I remarked before, however, it was personal between Tramiel and TI. You may remember that almost ten years before these events Commodore had been a thriving maker of calculators and digital watches. TI had entered those markets along with Japanese companies with devices built entirely from their own chips, which allowed them to dramatically undercut Commodore’s prices and very nearly force them out of business. Only the acquisition of MOS Technologies and the PET had saved Commodore. Now Tramiel, who never forgot a slight much less a full-on assault, could smell payback. Thanks to MOS, Commodore were now also able to make for themselves virtually all of the chips found in the VIC-20 and the 64, with the exception only of the memory chips. TI’s recent actions would seem to indicate that they thought they could drive Commodore out of the computer market just as they had driven them out of the watch and calculator markets. But this time, with both companies almost fully vertically integrated, things would be different. Bill Turner’s colossal mistake was to build his promotional campaign for the TI-99/4A entirely around price, failing to note that it was not just much cheaper than the 64 but also much more capable than the VIC-20. As it was, no matter how low Turner went, Tramiel could always go lower, because the VIC-20 was a much simpler, cheaper design to manufacture. If the Home Computer Wars were going to be all about the price tag, Turner was destined to lose.

The TI-99/4A also had another huge weakness, one ironically connected with what TI touted as its biggest strength outside of its price: its reliance on “Solid State Software,” or cartridges. Producing cartridges for sale required vastly more resources than did distributing software on cassettes or floppy disks, and at any rate TI was determined to strangle any nascent independent software market for their machine in favor of cornering this lucrative revenue stream for their own line of cartridges. They closely guarded the secrets of the machine’s design, and threatened any third-party developers who managed to write something for the platform with lawsuits if they failed to go through TI’s own licensing program. Those who entered said program would be rewarded with a handsome 10 percent of their software’s profits. Thus the TI-99/4A lacked the variety of software — by which I mainly mean games, the guilty pleasure that really drove the home-computer market — that existed for the VIC-20 and, soon, the 64. Although this wasn’t such an obvious concern for ordinary consumers, the TI-99/4A was thus also largely bereft of the do-it-yourself hacker spirit that marked most of the early computing platforms. (Radio Shack was already paying similarly dearly for policies on their TRS-80 line that were nowhere near as draconian as those of TI.) This meant far less innovation, far less interesting stuff to do with the TI-99/4A.

Early in 1983, Commodore slashed the wholesale price of the VIC-20 yet again; soon it was available for $139 at K-Mart. TI’s cuts in response brought the street price of the TI-99/4A down to about $150. But now TI found to their horror that the tables were turned. They sat at the break-even point, while Commodore was able to cut the price of the VIC-20 yet further, all while pummeling them from above with the powerful 64, whose price was plunging even more quickly than that of the VIC-20. TI was reduced to using the TI-99/4A as a loss leader. They would just break even on the computer, but would hopefully make their profits on the cartridges they also sold for it. That can be a good strategy in the right situation; for instance, in our own time it’s helped Amazon remake the face of publishing in a matter of a few years with their Kindle e-readers. But it’s dependent on having stuff that people want to buy from you after you sell them the loss leader. TI did not; the software they had to sell was mostly unimpressive in both quality and variety compared to that available for the developer-friendly Commodore machines. And the price of those Commodore machines just kept dropping, putting TI deeper and deeper into a hole as they kept struggling to match. Soon just breaking even on each TI-99/4A was only a beautiful memory.

By September the price of a 64 at a big-box discount store was less than $200, the VIC-20 about $80. Bill Turner had already been let go in disgrace. Now a desperate TI was selling the TI-99/4A at far below their own cost to make it, even as Commodore was continuing to make a modest profit on every unit sold thanks to continuous efforts to reduce production costs. At last, on October 28, 1983, TI announced that it was pulling out of the home-computer market altogether, having lost a stunning half a billion dollars on the venture to that point in 1983 and gutted their share prices. The TI-99/4A had gone from world beater to fiasco in barely nine months; Turner from visionary to scapegoat in less. As a parting shot, TI dumped the rest of their huge unsold inventory of TI-99/4As onto the market, where at a street price of $50 or less they managed to cause a final bit of chaos for everyone left competing in the space.

But this kamikaze measure was the worst they could do. Jack Tramiel had his revenge. He had beaten Bill Turner, paid him back with interest for 1982. More importantly, he had beaten his old nemesis TI, delivering an embarrassment and a financial ache from which it would take them a long time to recover. With the battlefield all but cleared, the Christmas of 1983 turned into the Christmas of the Commodore 64. By year’s end sales were ticking merrily past the 2-million-unit mark. Even with all the discounting, North American sales revenue on Commodore’s hardware for 1983 more than doubled from that of 1982. A few non-contenders like the Coleco Adam and second-stringers like Atari’s persistent computer line aside, the Home Computer Wars were over. When their MOS chip-making division and their worldwide sales were taken into account, Commodore was now bigger than Apple, bigger than anyone left standing in the PC market with the exception only of IBM and Radio Shack, both of whose PC divisions accounted for only a small part of their total revenue. The 64 had also surpassed the Apple II as simply the computer to own if you really liked games, while also filling the gap left by the imploded Atari VCS market and, increasingly as the price dropped, the low-end home-computer market previously owned by the VIC-20 and TI-99/4A. Thanks to the Commodore 64, computer games were going big time. Love the platform and its parent company or hate them (and plenty did the latter, not least due to Tramiel’s instinct for the double cross that showed itself in more anecdotes than I can possibly relate on this blog), everybody in entertainment software had to reckon with them. Thanks largely to Commodore and TI’s price war, computer use exploded in the United States between 1982 and 1984. In late 1982, Compute!, a magazine pitched to the ordinary consumer with a low-cost home computer, had a monthly circulation of 100,000. Eighteen months later it was over 500,000. The idea of 500,000 people who not only owned PCs but were serious enough about them to buy a magazine dedicated to the subject would have sounded absurd at the time that the Commodore 64 was launched. And Compute! was just one piece of an exploding ecosystem.

Yet even at this, the supreme pinnacle of Tramiel’s long career in business, there was a whiff of the Pyrrhic in the air as the battlefield cleared. The 64 had barely made it out the door before five of its six principal engineers, the men who had put together such a brilliant little machine on such a shoestring, left Commodore. Among them were both Al Charpentier, designer of its VIC-II graphics chip, and Bob Yannes, designer of its SID sound chip. The problems had begun when Tramiel refused to pay the team the bonuses they had expected upon completing the 64; his justification was that putting the machine together had taken them six months rather than the requested three. They got worse when Tramiel refused to let them start working on a higher-end follow-up to the 64 that would offer 80-column text, a better disk system and a better BASIC, and could directly challenge the likes of the Apple II and IBM PC. And they reached a breaking point when Tramiel decided not to give them pay raises when review time came, even though some of the junior engineers, like Yannes, were barely making a subsistence living.

The five engineers left to start a company of their own. For a first project, they contracted with Atari to produce My First Computer, a product which would, via a membrane keyboard and a BASIC implementation on cartridge, turn the aged VCS into a real, if extremely limited, computer for children to learn with. Tramiel, who wielded lawyers like cudgels and seemed to regard his employees as indentured servants at best, buried the fledgling start-up in lawsuits. By the time they managed to dig themselves out, the VCS was a distant memory. Perhaps for the best in the long run: three of the engineers, including Charpentier and Yannes, formed Ensoniq to pursue Yannes’s love of electronic music. They established a stellar reputation for their synthesizers and samplers and eventually for a line of sound cards for computers which were for years the choice of the discriminating audiophile. Commodore, meanwhile, was left wondering just who was going to craft the follow-up to the 64, just as they had wondered how they would replace Chuck Peddle after Tramiel drove him away in a similar hail of legal action.

Tramiel also inexplicably soured on Kit Spencer, the mastermind of the public roll-outs of both the VIC-20 and the 64, although he merely sidelined him into all-but-meaningless bureaucratic roles rather than firing or suing him. Commodore’s advertising would never again be remotely as effective as it had been during the Spencer era. And in a move that attracted little notice at the time, Tramiel cut ties with Commodore’s few remaining dealers in late 1983. From now on the company would live or die with the mass merchandisers. For better or worse, Commodore was, at least in North America, now every bit a mass-market consumer-electronics company. The name “Commodore Business Machines” was truly a misnomer now, as the remnants of the business-oriented line that had begun with the original PET were left to languish and die. In later years, when Commodore tried to build a proper support network for a more expensive machine called the Amiga, the actions of 1982 and 1983 would come back to haunt them. Few dealers would have any desire to get in bed with them again.

In January of 1984 things would get even stranger for this company that never could seem to win for long before a sort of institutionalized entropy pulled them sideways again. But we’ll save that story for later. Next time we’ll look at what Apple was doing in the midst of all this chaos.

(I highly recommend Joseph Nocera’s article in the April 1984 Texas Monthly for a look at the Home Computer Wars from the losers’ perspective.)

 

Posted on December 20, 2012 in Digital Antiquaria, Interactive Fiction

 


The Commodore 64

As I described in my last article, many people were beginning to feel that change was in the air as they observed the field of videogame consoles and the emerging market for home computers during the middle part of 1982. If a full-fledged computer were to take the place of the Atari VCS in the hearts of America’s youth, which of the plethora of available machines would it be? IBM had confidently expected theirs to become the one platform to rule them all, but the IBM PC was not gaining the same traction in the home that it was enjoying in business, thanks to an extremely high price and lackluster graphics. Apple was still the media darling, but the only logical contender they could offer for the segment, the Apple II Plus, was looking increasingly aged. Its graphics capabilities, so remarkable for existing at all back in 1977, had barely been upgraded since, and weren’t really up to the sort of colorful action games the kids demanded. Nor was its relatively high price doing it any favors. Another contender was the Atari 400/800 line. Although introduced back in late 1979, these machines still had amongst the best graphics and sound capabilities on the market. On the other hand, the 400 model, with its horrid membrane keyboard, was cost-reduced almost to the point of unusability, while the 800 was, once again, just a tad on the expensive side. And Atari itself, still riding the tidal wave that was the VCS, showed little obvious interest in improving or promoting this tiny chunk of its business. Then of course there was Radio Shack, but no one — including them — seemed to know just what they were trying to accomplish with a pile of incompatible machines of wildly different specifications and prices all labeled “TRS-80.” And there was the Commodore VIC-20, which had validated for many people the whole category of home computer in the first place. Its price was certainly right, but it was just too limited to have long legs.


The TI-99/4A. Note the prominent port for “Solid State Software” to the right of the keyboard.

The most obvious contender came from an unexpected quarter. Back in early 1980, the electronics giant Texas Instruments had released a microcomputer called the TI-99/4. Built around a CPU of TI’s own design, it was actually the first 16-bit machine to hit the market. It had a lot of potential, but also a lot of flaws and oddities to go with its high price, and went nowhere. Over a year later, in June of 1981, TI tried again with an updated version, the TI-99/4A. The new model had just 16 K of RAM, but TI claimed more was not necessary. Instead of using cassettes or floppy disks, they sold software on cartridges, a technique they called “Solid State Software.” Since the programs resided in the ROM of the cartridge, they didn’t need to be loaded into RAM; RAM was needed only for the data the programs manipulated. The idea had some real advantages. Programs loaded instantly and reliably, something that couldn’t be said for many other storage techniques, and left the user to fiddle with fragile tapes or disks only to load and save her data files. To many people this just felt more like the way a consumer-electronics device ought to work — no typing arcane commands and then waiting and hoping, just pop a cartridge in and turn the thing on. The TI-99/4A also had spectacularly good graphics, featuring sprites: little objects that are independent of the rest of the screen and can be moved about with very little effort on the part of the computer or its programmer. They are ideal for implementing action games; in a game of Pac-Man, for instance, the title character and each of the ghosts would be implemented as a sprite. Of the other contenders, only the Atari 400 and 800 offered sprites — as did, tellingly, all of the game consoles. Indeed, they were considered something of a necessity for a really first-rate gaming system. With these virtues plus a list price of just $525, the TI-99/4A was a major hit right out of the gate, selling in numbers to rival the even cheaper but much less capable VIC-20. It would peak at the end of 1982 with a rather extraordinary (if short-lived) 35 percent market share, and would eventually sell in the neighborhood of 2.5 million units.

With the TI-99/4A so hot that summer of 1982, the one wildcard — the one obstacle to anointing it the king of home computers — was a new machine just about to ship from Commodore. It was called the Commodore 64, and it would change everything. Its story had begun the previous year with a pair of chips.

In January of 1981 some of the engineers at Commodore’s chipmaking subsidiary, MOS Technologies, found themselves without a whole lot to do. The PET line had no major advancements in the immediate offing, and the VIC-20’s design was complete (and already released in Japan, for that matter). Ideally they would have been working on a 16-bit replacement for the 6502, but Jack Tramiel was uninterested in funding such an expensive and complicated project, a choice that stands as amongst the stupidest in a veritable encyclopedia of stupidity written by Commodore management over the company’s chaotic life. With that idea a nonstarter, the engineers hit upon a more modest project: to design a new set of graphics and sound chips that would dramatically exceed the capabilities of the VIC-20 and (ideally) anything else on the market. Al Charpentier would make a graphics chip to be called the VIC-II, the successor to the VIC chip that gave the VIC-20 its name. Bob Yannes would make a sound synthesizer on a chip, the Sound Interface Device (SID). They took the idea to Tramiel, who gave them permission to go ahead, as long as they didn’t spend too much.

In deciding what the VIC-II should be, Charpentier looked at the graphics capabilities of all of the computers and game machines currently available, settling on three as the most impressive, and thus the ones critical to meet or exceed: the Atari 400 and 800, the Mattel Intellivision console, and the soon-to-be-released TI-99/4A. Like all of these machines, the VIC-II chip would have to have sprites. In fact, Charpentier spent the bulk of his time on them, coming up with a very impressive design that allowed up to eight onscreen sprites in multiple colors. (Actually, as with so many features of the VIC-II and the SID, this was only the beginning. Clever programmers would quickly come up with ways to reuse the same sprite objects, thus getting even more moving objects on the screen.) For the display behind the sprites, Charpentier created a variety of character-based and bitmapped modes, with palettes of up to 16 colors at resolutions of up to 320 × 200. On balance, the final design did indeed exceed or at least match the aggregate capabilities of anything else on the market. It offered fewer colors than the Atari’s 128, for example, but a much better sprite system; fewer total sprites (without trickery) than the TI-99/4A’s 32, but bigger and more colorful ones, and with about the same background display capabilities.
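To get a feel for how little effort sprites demanded of the programmer, here is a minimal sketch, jumping ahead for a moment to the finished Commodore 64 and its documented VIC-II registers, that puts a solid sprite on the screen. (The choice of block 13 for the shape data is just a common convention; it corresponds to an otherwise unused scrap of RAM at address 832.)

10 FOR I=0 TO 62:POKE 832+I,255:NEXT : REM fill block 13 with a solid shape
20 POKE 2040,13 : REM sprite 0's data pointer: block 13 (13*64 = 832)
30 POKE 53287,7 : REM sprite 0's color register: yellow
40 POKE 53248,160:POKE 53249,120 : REM sprite 0's X and Y position registers
50 POKE 53269,1 : REM enable sprite 0; the VIC-II redraws it every frame

Animating the object is then just a matter of rewriting the two position registers; the chip composites the sprite over the background on its own, with the CPU never once touching the screen behind it.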

If the VIC-II was an evolutionary step for Commodore, the SID was a revolution in PC and videogame sound. Bob Yannes, just 24 years old, had been fascinated by electronic sound for much of his life, devouring early electronica records like those by Kraftwerk and building simple analog synthesizers from kits in his garage. Hired by MOS right out of university in 1978, he felt as if he had spent his whole tenure there waiting for just this project. An amateur musician himself, he was appalled by the sound chips that other engineers thought exceptional, like the one in the Atari 400 and 800. From a 1985 IEEE Spectrum article on the making of the Commodore 64:

The major differences between his chip and the typical videogame sound chips, Yannes explained, were its more precise frequency control and its independent envelope generators for shaping the intensity of a sound. “With most of the sound effects in games, there is either full volume or no volume at all. That really makes music impossible. There’s no way to simulate the sound of any instrument even vaguely with that kind of envelope, except maybe an organ.”

Although it is theoretically possible to use the volume controls on other sound chips to shape the envelope of a sound, very few programmers had ever tackled such a complex task. To make sound shaping easy, Yannes put the envelope controls in hardware: one register for each voice to determine how quickly a sound builds up; two to determine the level at which the note is sustained and how fast it reaches that level; and one to determine how fast the note dies away. “It took a long time for people to understand this,” he conceded.

But programmers would come to understand it in the end, and the result would be a whole new dimension to games and computer art. The SID was indeed nothing short of a full-fledged synthesizer on a chip. With its three independent voices, it was capable of amazing things in skilled hands; the best SID compositions still sound great today. Games had beeped and exploded and occasionally even talked for years. Now, however, the emotional palette game designers had to paint on would expand dramatically. The SID would let them express deep emotions through sound and (especially) music, from stately glory to the pangs of romantic love, from joy to grief.
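To make the envelope idea concrete, here is a minimal sketch, in the Commodore 64’s own BASIC, of what driving the SID eventually looked like. The register addresses are the documented ones from the SID’s map starting at 54272 ($D400); the particular envelope and pitch values are illustrative, not gospel:

10 S=54272 : REM base address of the SID's registers ($D400)
20 POKE S+24,15 : REM master volume to maximum
30 POKE S+5,144 : REM voice 1: gradual attack (9), instant decay (0)
40 POKE S+6,249 : REM voice 1: full sustain (15), slow release (9)
50 POKE S,49:POKE S+1,28 : REM pitch value 7217, near concert A (NTSC)
60 POKE S+4,17 : REM triangle waveform plus gate bit: the note swells in
70 FOR T=1 TO 1000:NEXT : REM hold the gate for roughly a second
80 POKE S+4,16 : REM clear the gate: the note fades at the release rate

Two POKEs set the four envelope parameters Yannes describes; flipping a single gate bit then starts and releases the note, with the chip shaping the volume curve entirely on its own.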

In November of 1981 the MOS engineers brought their two chips, completed at last, to Tramiel to find out what he’d like to do with them. He decided that they should put them into a successor to the VIC-20, to be tentatively titled the VIC-40. In the midst of this discussion, it emerged that the MOS engineers had one more trick up their sleeves: a new variant of the 6502 called the 6510, which offered an easy way to build an 8-bit computer with more than 48 K of RAM by using a technique called bank switching.

Let’s stop here for just a moment to consider why this should have been an issue at all. Both the Zilog Z80 and the MOS 6502 CPUs that predominated among early PCs are 8-bit chips with 16-bit address buses. The latter number is the one that concerns us right now; it means that the CPU is capable of addressing up to 64 K of memory (2 to the 16th power is 65,536, or 64 K worth of addresses). So why the 48 K restriction? you might be asking. Well, you have to remember that a computer does not only address RAM; there is also the need for ROM. In the 8-bit machines, the ROM usually contains a BASIC-based operating environment along with a few other essentials like the glyphs used to form characters on the screen. All of this usually consumes about 16 K, leaving 48 K of the CPU’s address space to be mapped to RAM. With the arrival of the 48 K Apple II Plus in 1979, the industry largely settled on this as both the practical limit for a Z80- or 6502-based machine and the configuration that marked a really serious, capable PC. There were some outliers, such as Apple’s Language Card, which let a II Plus be expanded to 64 K of RAM by dumping BASIC entirely in favor of a Pascal environment loaded from disk, but the 48 K limit was largely accepted as just a fact of life for most applications.

With the 6510, however, the MOS engineers added some circuitry to the 6502 to make it easy to swap pieces of the address space between two (or more) alternatives. Below is an illustration of the memory of the eventual Commodore 64.

Commodore 64 memory map

Ignoring the I/O block as out of scope for this little exercise, let’s walk through this. First we have 1 K of RAM used as a working space to hold temporary values and the like (including the program stack). Then 1 K is devoted to storing the current contents of the screen. Next comes the biggest chunk, 38 K for actual BASIC programs. Then 8 K of ROM, which stores the BASIC language itself. Then comes another 4 K of “high RAM” that’s gotten trapped behind the BASIC ROM; this is normally inaccessible to the BASIC programmer unless she knows some advanced techniques to get at it. Then 4 K of ROM to hold the glyphs for the standard onscreen character set. Finally, 8 K of kernel, storing routines for essential functions like reading the keyboard or interacting with cassette or disk drives. All of this would seem to add up to a 44 K RAM system, with only 40 K of it easily accessible. But notice that each piece of ROM has RAM “underneath” it. Thanks to the special circuitry on the 6510, a programmer can swap RAM for ROM if she likes. Programming in assembly language rather than BASIC? Swap out the BASIC ROM, and get another 8 K of RAM, plus easy, contiguous access to that high block of another 4 K. Working with graphics instead of words, or want to define your own font? Swap out the character ROM. Taking over the machine entirely, and thus not making much use of the built-in kernel routines? Swap the kernel for another 8 K of RAM, and maybe just swap it back in from time to time when you actually want to use something there.
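The switch itself is thrown through an I/O port built into the 6510, which appears to the programmer as memory location 1; its lowest three bits select which ROMs are visible. Here is a minimal sketch, following the technique given in Commodore’s own programming manuals, of banking the character ROM into the address space just long enough to read the eight bytes that define the “@” glyph:

10 DIM G(7)
20 POKE 56334,PEEK(56334) AND 254 : REM pause the keyboard-scan IRQ
30 POKE 1,PEEK(1) AND 251 : REM clear bit 2 of the 6510 port: character ROM in
40 FOR I=0 TO 7:G(I)=PEEK(53248+I):NEXT : REM read the glyph for "@"
50 POKE 1,PEEK(1) OR 4 : REM set the bit again: I/O chips back in
60 POKE 56334,PEEK(56334) OR 1 : REM restart the keyboard-scan IRQ
70 FOR I=0 TO 7:PRINT G(I):NEXT : REM safe to print again now

The keyboard interrupt has to be paused because, while the character ROM is banked in, the hardware registers the interrupt handler expects to find at those addresses are gone. Bits 0 and 1 of the same port bank the BASIC and kernel ROMs in and out in the same way, though that is a game only an assembly-language program can safely play.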

Commodore 64 startup screen

The above will hopefully answer the most common first question of a new Commodore 64 user, past or present: Why does my “64 K RAM system” say it has only 38 K free for BASIC? The rest of the memory is there, but only for those who know how to get at it and who are willing to forgo the conveniences of BASIC. I should emphasize here that the concept of bank switching was hardly an invention of the MOS engineers; it’s a fairly obvious approach, after all. Apple had already used the technique to pack a full 128 K of RAM into a 6502-based computer of their own, the failed Apple III (about which more in the very near future). The Apple III, however, was an expensive machine targeted at businesses and professionals. The Commodore 64 was the first to bring the technique to the ordinary consumer market. Soon it would be everywhere, giving the venerable 6502 and Z80 new leases on life.

Jack Tramiel wasn’t a terribly technical fellow, and likely didn’t entirely understand what an extra 16 K of memory would be good for in the first place. But he knew a marketing coup when he saw one. Thus the specifications of the new machine were set: a 64 K system built around MOS’s three recent innovations — the 6510, the VIC-II, and the SID. The result should be cheap enough to produce that Commodore could sell it for less than $600. Oh, and please have a prototype ready for the January 1982 Winter CES show, less than two months away.

With so little time and such harsh restrictions on production costs, Charpentier, Yannes, and the rest of their team put together the most minimalist design they could to bind those essential components together. They even managed to get enough of it done to have something to show at Winter CES, where the “VIC-40” was greeted with excitement on the show floor but polite skepticism in the press. Commodore, you see, had a well-earned reputation, dating from the days when the PET was the first of the trinity of 1977 to be announced and shown but the last to actually ship, for over-promising at events like these and delivering late or not at all. Yet when Commodore showed the machine again in June at the Summer CES — much more polished, renamed the Commodore 64 to emphasize what Tramiel and Commodore’s marketing department saw as its trump card, and still promised for less than $600 — the press had to start paying major attention. Within weeks it actually started shipping. The new machine was virtually indistinguishable from the VIC-20 in external appearance because Commodore hadn’t been willing to spend the time or money to design a new case.

The Commodore 64

Inside it was one hell of a machine for the money, although not without its share of flaws that a little more time, money, and attention to detail during the design process could have easily corrected.

The BASIC housed in its ROM (“BASIC 2.0”) was painfully antiquated. It was actually the same BASIC that Tramiel had bought from Microsoft for the original PET back in 1977. Bill Gates, in a rare display of naivete, sold him the software outright for a flat fee of $10,000, figuring Commodore would have to come back soon for another, better version. He obviously didn’t know Jack Tramiel very well. Ironically, Commodore did have on hand a better BASIC 4.0, which they had used in some of the later PET models, but Tramiel nixed using it in the Commodore 64 because housing it would have required 16 K of ROM rather than 8 K, at correspondingly greater expense. People were already getting a lot for their money, he reasoned. Why should they expect a decent BASIC as well? The Commodore 64’s BASIC was not only primitive but also completely lacking in commands to actually harness the machine’s groundbreaking audiovisual capabilities; graphics and sound could be accomplished in BASIC only by using “peek” and “poke” commands to access registers and memory locations directly, an extremely awkward, inefficient, and ugly way of programming. If the memory restrictions on BASIC weren’t enough to convince would-be game programmers to learn assembly language, this certainly was. The Commodore 64’s horrendous BASIC likely accelerated an already ongoing flight from the language amongst commercial game developers. For the rest of the 1980s, game development and assembly language would go hand in hand.
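To see just how unfriendly this was in practice, consider a minimal sketch of what it took merely to recolor the screen and put a single character in its top-left corner, every magic number below coming straight from the memory map rather than from anything resembling a graphics command:

10 POKE 53280,0 : REM the VIC-II's border-color register ($D020): black
20 POKE 53281,6 : REM the background-color register ($D021): blue
30 POKE 1024,1 : REM screen memory starts at 1024; screen code 1 is "A"
40 POKE 55296,1 : REM the matching cell of color memory at 55296: white

Owners of an Atari 800 or an Apple II at least had dedicated statements like GRAPHICS or COLOR for this sort of housekeeping; the Commodore 64 owner had nothing but a table of addresses.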

Due to a whole combination of factors — including miscommunication among marketing, engineering, and manufacturing, an ultimately pointless desire to be hardware compatible with the VIC-20, component problems, cost-cutting, and the sheer rush of putting a product together in such a limited time frame — the Commodore 64 ended up saddled with a disk system that would become, even more than the primitive BASIC, the albatross around the platform’s neck. It’s easily the slowest floppy-disk system ever sold commercially, on the order of thirty times slower than Steve Wozniak’s masterpiece, the Apple II’s Disk II system. Interacting with disks from BASIC 2.0, which was written before disk drives existed on PCs, requires almost as much patience as waiting for a program to load. For instance, you have to type “LOAD ‘$’, 8” followed by “LIST” just to get a directory listing. As an added bonus, doing so wipes out any BASIC program you happened to have in memory.

The disk system’s flaws frustrate because they dissipate a lot of potential strengths. Commodore had had a unique approach to disk drives ever since producing their first for the PET line circa 1979. A Commodore disk drive is a smart device, containing its own 6502 CPU as well as ROM and 2 K of RAM. The DOS used on other computers like the Apple II to tell the computer how to control the drive, manage the filesystem, etc., is unnecessary on a Commodore machine. The drive can control itself very well, thank you very much; it already knows all about that stuff. This brings some notable advantages. No separate DOS has to be loaded into the computer’s RAM, eating precious memory. DOS 3.3, for example, the standard on the Apple II Plus at the time of the Commodore 64’s introduction, eats up more than 10 K of that machine’s precious 48 K of RAM. Thus the Commodore 64’s memory edge was in practical terms even more significant than it appeared on paper. Because it was possible to write small programs for the drive’s CPU and load them into the drive’s RAM for execution, the whole system was a delight for hackers. One favorite trick was to load a disk-copying program into a pair of drives, then physically disconnect them from the computer. They would continue happily copying disks on their own, as long as the user kept putting more disks in. More practically for average users, it was often possible for games to play music or display animated graphics while simultaneously loading from the drive; other computers’ CPUs were usually too busy controlling the drive to manage this. Of course, this was a very good feature for this particular computer, because Commodore 64 users would be spending a whole lot more time than users of other computers waiting for their disk drives to load their programs.
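The drive’s intelligence shows up even in mundane housekeeping. A Commodore 64 never formats a disk itself; it simply opens the drive’s command channel and sends a short string, after which the drive’s own processor does all the work. A minimal sketch, using the standard commands from Commodore’s drive manuals (the disk name and ID here are arbitrary):

10 OPEN 15,8,15 : REM channel 15 of device 8 is the drive's command channel
20 PRINT#15,"N0:MYDISK,01" : REM the NEW command formats (and erases!) the disk
30 CLOSE 15 : REM the drive's own CPU does all the actual work

The computer’s only role is to deliver those few bytes over the serial bus; the minutes of whirring and head-stepping that follow are entirely the drive’s affair.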

Quality-control issues plagued the entire Commodore 64 line, especially in the first couple of years. One early reviewer had to return two machines before Commodore shipped him one that worked; some early shipments to stores were allegedly 80 percent dead on arrival. To go with all of their other problems, the disk drives were particularly unreliable. In one early issue, Compute!’s Gazette magazine stated that four of the seven drives in their offices were currently dead. The poor BASIC and unfriendly operating environment, the atrocious disk system, and the quality-control issues, combined with no option for getting the 80-column display considered essential for word processing and much other business software, kept the Commodore 64 from being considered seriously by most businesses as an alternative to the Apple II or IBM PC. Third-party solutions did address many of the problems: improved BASICs were released as plug-in cartridges, and several companies rewrote the system software to improve transfer speeds by a factor of six or more. But businesses wanted machines that just worked for them out of the box, which Apple and IBM largely gave them while Commodore did not.

None of that mattered much to Commodore, at least for now, because they were soon selling all of the Commodore 64s they could make for use in homes. No, it wasn’t a perfect machine, not even with its low price (which dropped further virtually by the month), its luxurious 64 K of memory, its versatile graphics, and its marvelous SID chip. But, like the Sinclair Spectrum that was debuting almost simultaneously in Britain, it was the perfect machine for this historical moment. Also like the Spectrum, it heralded a new era in its home country, one in which people would play — and make — games in numbers that dwarfed what had come before. For a few brief years, the premier mainstream gaming platform in the United States would be a full-fledged computer rather than a console — the only time, before or since, that this has happened. We’ll talk more about the process that led there next time.

(As you might expect, much of this article is drawn from Brian Bagnall’s essential history of Commodore. The IEEE Spectrum article referenced above was also a gold mine.)

 

Posted on December 17, 2012 in Digital Antiquaria, Interactive Fiction

 
