
Shiny and Exciting vs. Dull and Boring

Apple Computer has remained quite consistent about a number of things throughout all the years of their existence. One of the most striking of these is their complete disinterest in competing on the basis of price. The implicit claim has always been that Apple products are special, made for special people, even as the number of “special people” who buy them has paradoxically in recent years rivaled the number of the more plebeian sorts who favor the likes of Microsoft. As such, it’s only just and right that you have to pay a bit of a premium for them. That’s of course a policy every company would have if they could get away with it. The remarkable thing about Apple is that they’ve been able to do just that for several decades now. Partly that success comes down to the sheer dumb luck of coming along at the right moment to be anointed the face of the PC revolution, with an all-American story of entrepreneurship and a charismatic and photogenic young founder that the mainstream media couldn’t help but love. Partly it comes down to a series of products that reflected a real vision for the potentialities of computing as opposed to just offering more MB and MHz than what had come out the previous year. But, perhaps most of all, the world believed Apple was special because Apple believed Apple was special, with a sincere certainty that no PR firm could have faked. At its worst, this quality makes the company and the cult of fanatically loyal users that surrounds it insufferable in their smug insularity. At its best, it lets them change the world in ways that even their perpetual rival Microsoft never quite managed. Bill Gates had a practical instinct for knowing where the computer industry was going and how he could best position Microsoft to ride the wave, but Steve Jobs had a vision for how computer technology could transform life as ordinary people knew it.
The conflict between this utopian vision, rooted in Apple’s DNA just as it was in that of Jobs the unrepentant California dreamer himself, and the reality of Apple’s prices that have historically limited its realization to a relatively affluent elite is just one of a number of contradictions and internal conflicts that make Apple such a fascinating study. Some observers perceive Apple as going through something of an identity crisis in this new post-Jobs era, with advertising that isn’t as sharp and products that don’t quite stand out like they used to. Maybe Apple, for thirty years the plucky indie band that belonged to the cool kids and the tastemakers, isn’t sure how to behave now that they’re on a major label and being bought by everyone. Their products are now increasingly regarded as just products, without the aura of specialness that insulated them from direct comparison on the strict basis of price and features for so long.

But that’s now. When the Home Computer Wars got started in earnest in 1982, Apple adopted a policy of, as a contemporaneous Compute! editorial put it, “completely ignoring the low-end market,” positioning themselves as blissfully above such petty concerns as price/feature comparison charts. Even for a company without Apple’s uniquely Teflon reputation that wasn’t necessarily a bad policy to follow. As Texas Instruments and Atari would soon learn, no one had a prayer of beating Jack Tramiel’s vertically-integrated Commodore in a price war. At the same time, though, the new Commodore 64 represented in its 64 K of memory a significant raising of the bar for any everyday, general-purpose PC. In that sense Apple did have to respond, loath as they would have been to publicly admit that the likes of Commodore could have any impact on their strategy. The result, the Apple IIe, was the product not only of the changes wrought to the industry by the Commodore 64 but also of the last several chaotic years inside Apple’s Cupertino, California, headquarters.

Apple’s superb public image could somewhat belie the fact that through 1982 they had managed to release exactly one really successful computer, the Apple II Plus of 1979. Of the machines prior to the II Plus, the Apple I (1976) had been a mere hobbyist endeavor assembled by hand in Jobs’s garage, and had sold in about the quantities you might expect for such a project; the original Apple II (1977) was an impressive proof of concept that was overshadowed by Radio Shack’s cheaper, better-distributed TRS-80. It was the 48 K II Plus — mated to Steve Wozniak’s last great feat of minimalist engineering, the Disk II system, and with the PC industry’s first killer app, VisiCalc, in tow — that made Apple. After it… well, there’s quite a tale.

Mike Markkula

Michael Scott

It’s often forgotten that Apple’s early history isn’t just the story of the two kids who started the company in a garage. Right after the two Steves came the two Mikes. Apple’s third and fourth employees, Mike Markkula and Michael Scott, were both older Silicon Valley insiders with money and resumes listing companies like Intel and Fairchild — about as Establishment as you could get. With his role of visionary-in-chief and public spokesman not yet clearly defined, Jobs was the odd man out among the original four, bereft of both Woz’s technical genius and the connections, deep pockets, and business acumen of Markkula and Scott. Tellingly, when Scott issued identification badges for the first time he made Woz, the architect of all of the company’s projects so far and presumed key to their future success, Employee #1, until Jobs’s endless whining convinced him to make him Employee #0 as a compromise. Markkula and Scott managed in remarkably short order to institute a very Establishment bureaucratic structure inside the offices of the supposedly freewheeling Apple. Jobs bounced about the org chart, in Frank Rose’s words “like a feral child” (like the young Bill Gates, the young Steve Jobs was not always big on personal hygiene), leaving chaos in his wake.

Within Apple the II Plus was regarded as the logical end of the road for the Apple II, the ultimate maturation of Woz’s original template. By the time it was released all three of the others were souring on Woz himself. Markkula and Scott found Woz’s lone-wolf style of high-wire engineering to be incompatible with their growing business and its associated bureaucracy, while Jobs resented Woz’s status as the father of the Apple II and desperately wanted to prove himself by creating something of his own, without Woz’s involvement. And so the three came to a tacit agreement to ease Woz aside. Woz himself, whose native guilelessness and naivete are so extreme they can seem almost disingenuous to more cynical souls, simply let it happen without seeming to even realize what was going on. It was decided to divide Apple’s research and development into two projects. One, codenamed Sara, would be a practical attempt to create a logical successor to the Apple II that would be more expensive but also more appealing to businesses and high-end users. The other, codenamed Lisa, would be built around a powerful new processor just released by Motorola, the 68000. As a long-term, blue-sky project, Lisa’s design and ultimate goals would remain amorphous for quite some time while most of Apple’s day-to-day effort went into Sara.

The Sara would be designed around the same MOS 6502 CPU as the Apple II, but clocked to 1.8 MHz rather than 1 MHz. A bank-switching scheme would let the machine ship with a full 128 K of memory, with more expansion possible. The graphics would be a dramatic improvement on the Apple II, with higher resolutions and none of the weird color restrictions that could be such a challenge for programmers. The standard text display would now show the 80 columns so critical to word processing and many other business tasks. And replacing the simple ROM-housed BASIC environment of the Apple II would be a more sophisticated operating system booted off of disk. To make sure everyone got the message about this latter, they would even call it the “Sophisticated Operating System,” or, rather unfortunately in light of later developments, SOS. (In one of the more infuriating examples of the endemic cutesiness that has often afflicted Apple, they insisted that “SOS” be pronounced as “Applesauce,” but at least on this occasion most people ended up having none of it.)

It all sounded like a great plan on paper. However, the new group of engineers hired to replace Woz soon had their hands full dealing with Jobs, whose vision for Sara seemed to change by the day. Determined to make Sara “his” machine, he demanded things that were often incompatible with the realities of computer technology circa 1979. The new machine must have a certain external size and shape, leaving the engineers to cram chips into spaces with inadequate ventilation. When they requested an internal fan, Jobs vetoed the idea; he wanted a sleek, elegant, quiet device, not something that sounded like a piece of industrial equipment. In addition to dealing with Jobs, the engineers were also put under enormous pressure to finish Sara, now to be called the Apple III, before Apple’s IPO in December of 1980. Cutting corners all the way, they managed to start shipping the machine just a few weeks before. In retrospect it was a lucky break that they cut it so close, because it meant that reports of what a mess the Apple III was hadn’t filtered up in enough strength or quantity by the time of the IPO to affect it.

The Apple III

Just as the engineers had feared, Apple IIIs started overheating with extended use. As they got too hot, chips would expand and come out of their sockets. Apple’s customer-support people were soon reduced to asking angry callers to please pick up their new $6000 wonder-computers and drop them back onto the desk in the hope that this would re-seat the chips. In addition to the overheating, a whole host of other problems cropped up, from shoddy cables to corroding circuit boards. Apple struggled to deal with the problems for months, repairing or replacing tens of thousands of machines. Unfortunately, the replacements were often as unreliable as the originals. Disgusted by the fiasco and by the company culture he felt had led to it (by which he meant largely Markkula and Jobs), Michael Scott quit Apple in July of 1981. With Woz having recognized there was little role left for him at Apple and having started on the first of a series of extended leaves of absence, two of the four original partners were effectively now gone. Far from claiming the Apple III as his equivalent to Woz’s baby the Apple II, Jobs quickly distanced himself from those left struggling to turn this particular sow’s ear into a silk purse. Randy Wigginton, Employee #6, put the situation colorfully: “The Apple III was kind of like a baby conceived during a group orgy, and [later] everybody had this bad headache and there’s this bastard child, and everyone says, ‘It’s not mine.’” Jobs moved on to the Lisa project, which was now really ramping up and becoming Apple’s big priority, not to mention his own great hope for making his mark. When he wore out his welcome there, he moved on yet again, to a new project called Macintosh.

In the end the Apple III remained in production until 1984 despite selling barely 100,000 units in total over that period. Within Apple it came to be regarded as a colossal missed opportunity which could have sewn up the business market for them before IBM entered with their own PC. That strikes me as doubtful. While the Apple III was unquestionably one of the most powerful 6502-based computers ever made on paper and gradually grew to be quite a nice little machine in real life as the remaining dogged engineers gradually solved its various problems, the 8088-based IBM PC was superior in processing power if nothing else, as well as having the IBM logo on its case. The Apple III was certainly damaging to Apple financially, but it could have been worse. Apple’s position as press favorites and the continuing success of the II Plus insulated them surprisingly well; relatively few lost faith in that special Apple magic.

Indeed, as Apple poured money into the Apple III and now into the ever more grandiose Lisa project, the II Plus just kept on selling, long after Apple’s own projections had it fading away. The fact is that the only thing that sustained Apple through this period of flailing about was the II Plus. If Apple’s own projections about its sales had been correct, there wouldn’t be an Apple still around today. Luckily, users simply refused to let it go, and their evangelization of the machine and the many third-party hardware and software makers who supported it kept it selling incredibly strongly even when Apple themselves seemed as if they could hardly care less about it. Thanks to Woz’s open design, they didn’t have to; the machine seemed almost infinitely malleable and expandable, an open canvas for others to paint upon. Far from appreciating their loyalty or even simply Apple’s good fortune, Jobs grew more and more frustrated at the Apple II’s refusal to die. He called those parts of the company still dedicated to the line the “dull and boring” divisions, the people who worked in them “Clydesdales” because they were so slow and plodding. He delighted in wandering over to the remnants of the Apple II engineering department (now stuck away in an ugly corner of Apple’s growing campus) to tell everyone how shit all of their products were, seemingly oblivious to the fact that those same products were the only things funding his grander vision of computing.

Dull and boring or not, by 1982 it was becoming obvious that Apple was tempting fate by continuing to leave their only commercially viable computer unimproved. Despite the flexibility of its basic design, some core aspects of the machine were now almost absurdly primitive. For instance, it still had no native support for lower-case letters. And now the much cheaper Commodore 64 was about to be released with a number of features that put the old II Plus to shame. At virtually any moment sales could collapse as users flocked at last to the 64 or something else, taking Apple as a company down with them. Apple may not have been interested in price wars, but it was past time that they reward Apple II users with some sort of sop for their loyalty. An engineer named Walt Broedner turned down a spot on Jobs’s growing Macintosh team to make a next-generation Apple II, out of love of the platform and a desire to sustain the legacy of Woz.

Back in 1980, Broedner and Woz had worked on a project to dramatically reduce the production cost of the II Plus by squeezing the whole design into vastly fewer chips. But then management, caught in the grip of Apple III fever, had cancelled it. The Apple II line, they had reasoned, wouldn’t sell for long enough that the savings would justify the cost of tooling up for a whole new design. Broedner now started again from the old plan he’d put together with Woz, but two more years of ever improving chip-making technology let him take it much further. In the end he and his team reduced the chip count from 120 to an incredible 31, while significantly increasing the computer’s capabilities. Using a bank-switching scheme even more intricate than that of the Commodore 64, they boosted standard memory to 64 K, while also making it possible to further expand the machine to 128 K or (eventually) more. An 80-column text display was now standard, with lower-case letters also making an appearance at long last. Just as the machine entered production, Broedner and his team realized that the changes they had made to implement 80-column text would also allow a new, higher-resolution graphics mode with only a few minor changes to the motherboard. Accordingly, all but the first batch of machines shipped with a “double-hi-res” mode. It came with even more caveats and color restrictions than the standard hi-res mode, but it ran at a remarkable 560×192. Finally, the new design was also cooler and more reliable. Broedner and team managed all of this while still keeping the new machine 99.9% compatible with software written for the old.
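The bank-switching trick mentioned above is worth a moment of unpacking: an 8-bit CPU like the 6502 can only address 64 K at a time, so machines like these mapped a larger pool of physical RAM into that fixed address space by flipping “soft switches” that select which bank the CPU currently sees. The sketch below is a minimal, hypothetical illustration of the general technique, not the IIe’s (or the 64’s) actual memory map; the class and method names are invented for the example.

```python
class BankedMemory:
    """Toy model of bank-switched RAM: a fixed-size CPU address space
    backed by several physical banks, selected by a 'soft switch'."""

    BANK_SIZE = 0x10000  # 64 K visible to the CPU at any one time

    def __init__(self, banks=2):
        # Two banks of 64 K gives 128 K of physical memory in total.
        self.banks = [bytearray(self.BANK_SIZE) for _ in range(banks)]
        self.active = 0  # which bank the CPU currently sees

    def select_bank(self, n):
        # Flip the soft switch: subsequent reads/writes hit bank n.
        self.active = n

    def read(self, addr):
        return self.banks[self.active][addr]

    def write(self, addr, value):
        self.banks[self.active][addr] = value & 0xFF


mem = BankedMemory(banks=2)
mem.write(0x2000, 0xAA)          # lands in bank 0
mem.select_bank(1)
mem.write(0x2000, 0xBB)          # same CPU address, different physical byte
mem.select_bank(0)
assert mem.read(0x2000) == 0xAA  # bank 0's copy is untouched
```

The same CPU address thus refers to different physical bytes depending on the switch state, which is why software that didn’t know about the switches (or flipped them carelessly) could misbehave on bank-switched machines.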

The Apple IIe

Known as the “Super II” during its development phase, Apple eventually settled on calling the new machine the IIe, as in “Enhanced.” Jobs predictably hated it, finding its big, chunky case with its removable top and row of hacker-friendly expansion slots anathema to his own ideas of computers as elegantly closed designs that were complete in themselves. He sniped at the IIe from its genesis right through its commercial debut and for years afterward. He had plenty of time to do so, because the IIe proved to be a spectacular success right from its release in January of 1983. At last Apple had a true successor to the II Plus, albeit in the form they had least expected and, truth be told, least desired. The Lisa, meanwhile, which shipped at last some six months after the IIe, turned into yet another major disappointment, not recouping a fraction of its huge development cost. Much to Jobs’s chagrin, it seemed that Apple simply couldn’t field a successful product unless it had a “II” on the case.

But what a success the IIe turned out to be, a veritable dream product for any company. Despite its much reduced manufacturing cost, Apple didn’t reduce the new machine’s selling price at all. Why should they, if people were still happy to buy at the old prices? They sold the IIe to distributors at three times what it cost to make, an all but unheard-of profit margin. The machine itself may have reflected little of Jobs’s sensibility, but the price at which Apple sold it was right in line with one of his core business philosophies, as Woz relayed in an anecdote from the earliest days of their friendship:

“Steve had worked in surplus electronics and said if you can buy a part for 30 cents and sell it to this guy at the surplus store for $6, you don’t have to tell him what you paid for it. It’s worth $6 to the guy. And that was his philosophy of running a business.”

There are a couple of ways to look at the outsize disparity between the IIe’s production cost and its selling price. One might see it as borderline unethical, a cynical fleecing of consumers who didn’t know any better and a marked contrast to Commodore’s drive to open up computing for everyone by selling machines that were in some ways inferior but in some ways better for less and less money. On the other hand, all of that seemingly unearned windfall didn’t disappear into thin air. Most of it was rather plowed back into Apple’s ambitious research-and-development projects that finally did change the world. (Not alone, and not to the degree Apple’s own rhetoric tried to advertise, but credit where it’s due.) Jack Tramiel, a man who saw computers as essentially elaborate calculators to be sold on the basis of price and bullet lists of technical specifications, would have been as incapable of articulating such a vision as he would have been of conceiving it in the first place. If nothing else, the IIe shows how good it is to have a cozy relationship with the press and an excellent public image, things Apple mostly enjoyed even in its worst years and Commodore rarely did even in its best. They make many people willing to pay a premium for your stuff and not ask too many questions in the process.

The Apple IIe sold like crazy, a cash cow of staggering proportions. By Christmas of 1983 IIe sales were already approaching the half-million mark. Sales then doubled for 1984, its best year; in that year alone they approached one million units. It would remain in production (with various minor variations) until November of 1993, the most long-lived single product in Apple’s corporate history. During much of that period it continued to largely sustain Apple as they struggled to get Lisa and later, after abandoning that as a lost cause, Macintosh off the ground.

In more immediate terms, the arrival of the Apple IIe also allowed the fruition of a trend in games begun by the Commodore 64. By the end of 1983, with not only the 64 and the IIe but also new Atari models introduced with 64 K of memory, that figure was already becoming a baseline expectation. A jump from 48 K to 64 K may seem a relatively modest one, but it allowed for that much more richness, that much more complexity, and went a long way toward enabling more ambitious concepts that began to emerge as the early 1980s turned into the mid-1980s.

(Unlike the relatively under-served Commodore, there is a wealth of published material on the history of Apple and its internal confusion during this period. Two of the best are West of Eden by Frank Rose and Apple Confidential by Owen Linzmayer. Among other merits, both give an unvarnished picture of what an absolute pain in the ass the young Steve Jobs could be.)

 

Posted on December 30, 2012 in Digital Antiquaria, Interactive Fiction

 


Business is War

In the 64 Commodore had their potentially world-beating home computer. Now they needed to sell it. Fortunately, Jack Tramiel still had on hand Kit Spencer, the British mastermind behind the PET’s success in his home country and the VIC-20’s in the United States. The pitchman for the latter campaign, William Shatner, was no longer an option to help sell the 64. His contract had run out just as the new machine was released, and his asking price for another go-round had increased beyond what Tramiel was willing to pay in the wake of his hit television series T.J. Hooker and the movie Star Trek II: The Wrath of Khan. Spencer decided to forgo a pitchman entirely in favor of a more direct approach that would hammer on the competition while endlessly repeating those two all-important numbers: 64 K and less than $600. He queued up a major advertising blitz in both print and television for the 1982 Christmas season, the second and final time in their history that Commodore would mount such a concentrated, smart promotional effort.

Effective as it was, the campaign had none of the creativity or easy grace of the best advertising from Apple or IBM. The ads simply regurgitated those two critical numbers over and over in a somewhat numbing fashion, while comparing them with the memory size and price of one or more unfortunate competitors. Surprisingly, there was little mention of the unique graphics and sound capabilities that in the long run would define the Commodore 64 as a platform. It almost seems as if Commodore themselves did not entirely understand the capabilities of the chips that Al Charpentier and Bob Yannes had created for them. Still, Spencer showed no fear of punching above his weight. In addition to the 64’s obvious competitors in the low-end market, he happily went after much more expensive, more business-oriented machines like the Apple II and the IBM PC. Indeed, here those two critical numbers, at least when taken without any further context, favored the 64 even more markedly. The computer industry had never before seen advertising this nakedly aggressive, this determined to name names and call out the competition on their (alleged) failings. It would win Commodore few friends inside the industry. But Tramiel didn’t care; the ads were effective, and that was the important thing.

Commodore took their shots at the likes of Apple and IBM, but the real goal had become ownership of the rapidly emerging low-end — read, “home computer” — market. Tramiel’s competition there were the game consoles and the two other computer makers making a serious mass-market play for the same consumers, Atari and Texas Instruments. For the lower end of the low end, Commodore had the VIC-20; for the higher end, the 64.

Atari 5200

Atari’s big new product for Christmas 1982 was the 5200, a new console based on the same chipset as their computer designs. (Those chips had originally been designed for a successor to the VCS, but rerouted into full-fledged computers when sales of the current VCS just kept increasing. Thus the process finally came full circle, albeit three years later than expected.) The 5200 was something of a stopgap, a rather panicked product from a company whose management had long since lost interest in engineering innovations. It actually marked Atari’s first major new hardware release in three years. Research and development, you see, had shrunk to virtually nil under the stewardship of CEO Ray Kassar, a former titan of the textile industry who held videogames and his customers in something perilously close to contempt. Despite being based on the same hardware, the 5200 was inexplicably incompatible with cartridges for the existing Atari home computers. Those games that were available at launch were underwhelming, and the 5200 was a major disappointment. Only the VCS — now retroactively renamed the 2600 to account for the new 5200 — continued to sell in good quantities, and those were shrinking steadily. Aside from the 2600 and 5200, Atari had only its two three-year-old computers, the chintzy, little-loved 400 and the impressive but also more expensive 800 with only 48 K of memory. With the latter selling for upwards of $600 and both machines halfheartedly (at best) promoted, the big battles of the conflict that the press would soon dub the “Home Computer Wars” would be fought between TI and Commodore. It would be a disappointing Christmas for Atari, and one which foretold bigger problems soon to come.

Put more personally — and very personal it would quickly become — the Home Computer Wars would be fought between Jack Tramiel and the youthful head of TI’s consumer-products division, William J. Turner. The opening salvo was unleashed shortly before the 64’s introduction by, surprisingly, TI rather than Commodore. At that time the TI-99/4A was selling for about $300, the VIC-20 for $240 to $250. In a move they would eventually come to regret, TI suddenly announced a $100 rebate on the TI-99/4A, bringing the final price of the machine to considerably less than that of the inferior VIC-20. With TI having provided him his Pearl Harbor, Jack Tramiel went to war. On the very same day that Turner had opened hostilities, Tramiel slashed the wholesale price of the VIC-20, bringing the typical retail price down into the neighborhood of $175. Despite this move, consumers chose the TI-99/4A by about a three to one margin that Christmas, obviously judging its superior hardware worth an extra $25 and the delayed gratification of waiting for a rebate check. Some fun advertising featuring Bill Cosby didn’t hurt a bit either, while Commodore’s own contract with William Shatner was now history, leaving little advertising presence for the VIC-20 to complement the big push Spencer was making with the 64. TI sold more than half a million computers in just a few months. Round One: TI.

Of course, the 64 did very well as well, although at almost $600 it sold in nowhere near the quantities it eventually would. In those days, computers were sold through two channels. One was the network of dedicated dealers who had helped to build the industry from the beginning, a group that included chains like Computerland and MicroAge as well as plenty of independent shops. More recent outlets were the so-called mass merchandisers — discounters like K-Mart and Toys ‘R’ Us that lived by stacking ’em deep and selling ’em cheap, with none of the knowledge and support to be found at the dealers. Commodore and TI had been the first to begin selling their computers through mass merchandisers. Here Tramiel and Turner shared the same vision, seeing these low-end computers as consumer electronics rather than tools for hobbyists or businessmen — a marked departure from the attitude of, say, Apple. It really wasn’t possible for a computer to be successful in both distribution models. As soon as it was released to the merchandisers, the game was up for the dealers, as customers would happily come to them to get all of their questions answered, then go make the actual purchase at the big, splashy store around the corner. Commodore’s dealers had had a hard time of it for years, suffering through the limited success of the PET line in the American market only to see Commodore pass its first major sales success there, the VIC-20, to the mass merchandisers. They were understandably desperate to have the 64. Cheap as it was for its capabilities, it still represented much more of an investment than the VIC-20. Surely buyers would want to take advantage of the expertise of a real dealer. Tramiel agreed, or at least claimed to. But then, just as the Christmas season was closing, he suddenly started shipping the 64 to the mass merchandisers as well. Dealers wondering what had happened were left with only the parable of the scorpion and the frog for solace. What could Jack say? It was just his nature. By the following spring the street price of a Commodore 64 had dropped below $400, and it could be found on the shelves of every K-Mart and Toys ‘R’ Us in the country.

With the Commodore 64 joining the VIC-20 in the trenches, Christmas 1982 was looking like only the opening skirmish. 1983 was the year when the Home Computer Wars would peak. This was also the year of the Great Videogame Crash, when the market for Atari 2600 hardware and software went into free fall. In one year’s time Atari went from being the darling of Wall Street to a potentially deadly anchor — hemorrhaging millions of dollars and complete with a disgraced CEO under investigation for insider trading — for a Warner Communications that was suddenly desperate to get rid of it before it pulled the whole corporation down. Just as some had been predicting the previous year, home computers moved in to displace some of the vacuum left by the 2600’s sudden collapse.

Atari 1200XL

In a desperate attempt to field a counterargument to the 64, Atari rushed into production early in 1983 their first new computer since introducing the 400 and 800 more than three years before. Thanks to a bank-switching scheme similar to that of the 64, the Atari 1200XL matched that machine’s 64 K of memory. Unfortunately, it was in almost every other respect a disaster. Atari made the 1200XL a “closed box” design, with none of the expansion possibilities that had made the 800 a favorite of hackers. They used new video hardware that was supposed to be better than the old, but instead yielded a fuzzier display on most monitors and televisions. Worst of all, the changes made to accommodate the extra memory made the new machine incompatible with a whole swathe of software written for the older machines, including many of the games that drove home-computer sales. An apocryphal story has sales of the Atari 800 dramatically increasing in the wake of the 1200XL’s release, as potential buyers who had been sitting on the fence rushed to buy the older machine out of fear it would soon be cancelled and leave them no option but the white elephant that was the 1200XL.

Whatever the truth of such stories, sales for the Atari computer line as a whole continued to lag far behind those of Commodore and TI, and far behind what would be needed to keep Atari a viable concern in this new world order. Huge as Atari (briefly) was, they had no chip-making facilities of their own. Instead, their products were full of MOS chips amongst others. Not only were both their console and computer lines built around the 6502, but MOS manufactured many of the game cartridges for the 2600 and 5200. Thus even when Commodore lost by seeing a potential customer choose an Atari over one of their own machines they still won in the sense that the Atari machine was built using some of their chips — chips for which Atari had to pay them.

Atari would largely be collateral damage in the Home Computer Wars. As I remarked before, however, it was personal between Tramiel and TI. You may remember that almost ten years before these events Commodore had been a thriving maker of calculators and digital watches. TI had entered those markets along with Japanese companies with devices built entirely from their own chips, which allowed them to dramatically undercut Commodore’s prices and very nearly force them out of business. Only the acquisition of MOS Technology and the PET had saved Commodore. Now Tramiel, who never forgot a slight much less a full-on assault, could smell payback. Thanks to MOS, Commodore were now also able to make for themselves virtually all of the chips found in the VIC-20 and the 64, with the exception only of the memory chips. TI’s recent actions would seem to indicate that they thought they could drive Commodore out of the computer market just as they had driven them out of the watch and calculator markets. But this time, with both companies almost fully vertically integrated, things would be different. Bill Turner’s colossal mistake was to build his promotional campaign for the TI-99/4A entirely around price, failing to note that it was not just much cheaper than the 64 but also much more capable than the VIC-20. As it was, no matter how low Turner went, Tramiel could always go lower, because the VIC-20 was a much simpler, cheaper design to manufacture. If the Home Computer Wars were going to be all about the price tag, Turner was destined to lose.

The TI-99/4A also had another huge weakness, one ironically connected with what TI touted as its biggest strength outside of its price: its reliance on “Solid State Software,” or cartridges. Producing cartridges for sale required vastly more resources than did distributing software on cassettes or floppy disks, and at any rate TI was determined to strangle any nascent independent software market for their machine in favor of cornering this lucrative revenue stream for their own line of cartridges. They closely guarded the secrets of the machine’s design, and threatened any third-party developers who managed to write something for the platform with lawsuits if they failed to go through TI’s own licensing program. Those who entered said program would be rewarded with a handsome 10 percent of their software’s profits. Thus the TI-99/4A lacked the variety of software — by which I mainly mean games, the guilty pleasure that really drove the home-computer market — that existed for the VIC-20 and, soon, the 64. Although this wasn’t such an obvious concern for ordinary consumers, the TI-99/4A was also largely bereft of the do-it-yourself hacker spirit that marked most of the early computing platforms. (Radio Shack was already paying similarly dearly for policies on their TRS-80 line that were nowhere near as draconian as those of TI.) This meant far less innovation, far less interesting stuff to do with the TI-99/4A.

Early in 1983, Commodore slashed the wholesale price of the VIC-20 yet again; soon it was available for $139 at K-Mart. TI’s cuts in response brought the street price of the TI-99/4A down to about $150. But now they found to their horror that the tables were turned. TI now sat at the break-even point, yet Commodore was able to cut the price of the VIC-20 yet further, while also pummeling them from above with the powerful 64, whose price was plunging even more quickly than that of the VIC-20. TI was reduced to using the TI-99/4A as a loss leader. They would just break even on the computer, but would hopefully make their profits on the cartridges they also sold for it. That can be a good strategy in the right situation; for instance, in our own time it’s helped Amazon remake the face of publishing in a matter of a few years with their Kindle e-readers. But it’s dependent on having stuff that people want to buy from you after you sell them the loss leader. TI did not; the software they had to sell was mostly unimpressive in both quality and variety compared to that available for the developer-friendly Commodore machines. And the price of those Commodore machines just kept dropping, putting TI deeper and deeper into a hole as they kept struggling to match. Soon just breaking even on each TI-99/4A was only a beautiful memory.

By September the price of a 64 at a big-box discount store was less than $200, the VIC-20 about $80. Bill Turner had already been let go in disgrace. Now a desperate TI was selling the TI-99/4A at far below their own cost to make it, even as Commodore was continuing to make a modest profit on every unit sold thanks to continuous efforts to reduce production costs. At last, on October 28, 1983, TI announced that it was pulling out of the PC market altogether, having lost a stunning half a billion dollars on the venture to that point in 1983 and gutted their share price. The TI-99/4A had gone from world beater to fiasco in barely nine months; Turner from visionary to scapegoat in less. As a parting shot, TI dumped the rest of their huge unsold inventory of TI-99/4As onto the market, where at a street price of $50 or less they managed to cause a final bit of chaos for everyone left competing in the space.

But this kamikaze measure was the worst they could do. Jack Tramiel had his revenge. He had beaten Bill Turner, paid him back with interest for 1982. More importantly, he had beaten his old nemesis TI, delivering an embarrassment and a financial ache from which it would take them a long time to recover. With the battlefield all but cleared, 1983 turned into the Christmas of the Commodore 64. By year’s end sales were ticking merrily past the 2-million-unit mark. Even with all the discounting, North American sales revenue on Commodore’s hardware for 1983 more than doubled from that of 1982. A few non-contenders like the Coleco Adam and second-stringers like Atari’s persistent computer line aside, the Home Computer Wars were over. When their MOS chip-making division and their worldwide sales were taken into account, Commodore was now bigger than Apple, bigger than anyone left standing in the PC market with the exception only of IBM and Radio Shack, both of whose PC divisions accounted for only a small part of their total revenue. The 64 had also surpassed the Apple II as simply the computer to own if you really liked games, while also filling the gap left by the imploded Atari VCS market and, increasingly as the price dropped, the low-end home-computer market previously owned by the VIC-20 and TI-99/4A. Thanks to the Commodore 64, computer games were going big time. Love the platform and its parent company or hate them (and plenty did the latter, not least due to Tramiel’s instinct for the double cross that showed itself in more anecdotes than I can possibly relate on this blog), everybody in entertainment software had to reckon with them. Thanks largely to Commodore and TI’s price war, computer use exploded in the United States between 1982 and 1984. In late 1982, Compute!, a magazine pitched to the ordinary consumer with a low-cost home computer, had a monthly circulation of 100,000. Eighteen months later it was over 500,000.
The idea of 500,000 people who not only owned PCs but were serious enough about them to buy a magazine dedicated to the subject would have sounded absurd at the time that the Commodore 64 was launched. And Compute! was just one piece of an exploding ecosystem.

Yet even at this, the supreme pinnacle of Tramiel’s long career in business, there was a whiff of the Pyrrhic in the air as the battlefield cleared. The 64 had barely made it out the door before five of its six principal engineers, the men who had put together such a brilliant little machine on such a shoestring, left Commodore. Among them were both Al Charpentier, designer of its VIC-II graphics chip, and Bob Yannes, designer of its SID sound chip. The problems had begun when Tramiel refused to pay the team the bonuses they had expected upon completing the 64; his justification was that putting the machine together had taken them six months rather than the requested three. They got worse when Tramiel refused to let them start working on a higher-end follow-up to the 64 that would offer 80-column text, a better disk system and a better BASIC, and could directly challenge the likes of the Apple II and IBM PC. And they reached a breaking point when Tramiel decided not to give them pay raises when review time came, even though some of the junior engineers, like Yannes, were barely making a subsistence living.

The five engineers left to start a company of their own. For a first project, they contracted with Atari to produce My First Computer, a product which would, via a membrane keyboard and a BASIC implementation on cartridge, turn the aged VCS into a real, if extremely limited, computer for children to learn with. Tramiel, who wielded lawyers like cudgels and seemed to regard his employees as indentured servants at best, buried the fledgling start-up in lawsuits. By the time they managed to dig themselves out, the VCS was a distant memory. Perhaps for the best in the long run: three of the engineers, including Charpentier and Yannes, formed Ensoniq to pursue Yannes’s love of electronic music. They established a stellar reputation for their synthesizers and samplers and eventually for a line of sound cards for computers which were for years the choice of the discriminating audiophile. Commodore, meanwhile, was left wondering just who was going to craft the follow-up to the 64, just as they had wondered how they would replace Chuck Peddle after Tramiel drove him away in a similar hail of legal action.

Tramiel also inexplicably soured on Kit Spencer, mastermind of both the VIC-20’s and the 64’s public roll-outs, although he only sidelined him into all but meaningless bureaucratic roles rather than fire and/or sue him. Commodore’s advertising would never again be remotely as effective as it had been during the Spencer era. And in a move that attracted little notice at the time, Tramiel cut ties with Commodore’s few remaining dealers in late 1983. From now on the company would live or die with the mass merchandisers. For better or worse, Commodore was, at least in North America, now every bit a mass-market consumer-electronics company. The name “Commodore Business Machines” was truly a misnomer now, as the remnants of the business-oriented line that had begun with the original PET were left to languish and die. In later years, when they tried to build a proper support network for a more expensive machine called the Amiga, their actions of 1982 and 1983 would come back to haunt them. Few dealers would have any desire to get in bed with them again.

In January of 1984 things would get even stranger for this company that never could seem to win for long before a sort of institutionalized entropy pulled them sideways again. But we’ll save that story for later. Next time we’ll look at what Apple was doing in the midst of all this chaos.

(I highly recommend Joseph Nocera’s article in the April 1984 Texas Monthly for a look at the Home Computer Wars from the losers’ perspective.)

 

Posted by on December 20, 2012 in Digital Antiquaria, Interactive Fiction

 


The Commodore 64

As I described in my last article, many people were beginning to feel that change was in the air as they observed the field of videogame consoles and the emerging market for home computers during the middle part of 1982. If a full-fledged computer was to take the place of the Atari VCS in the hearts of America’s youth, which of the plethora of available machines would it be? IBM had confidently expected theirs to become the one platform to rule them all, but the IBM PC was not gaining the same traction in the home that it was enjoying in business, thanks to an extremely high price and lackluster graphics. Apple was still the media darling, but the only logical contender they could offer for the segment, the Apple II Plus, was looking increasingly aged. Its graphics capabilities, so remarkable for existing at all back in 1977, had barely been upgraded since, and weren’t really up to the sort of colorful action games the kids demanded. Nor was its relatively high price doing it any favors. Another contender was the Atari 400/800 line. Although introduced back in late 1979, these machines still had amongst the best graphics and sound capabilities on the market. On the other hand, the 400 model, with its horrid membrane keyboard, was cost-reduced almost to the point of unusability, while the 800 was, once again, just a tad on the expensive side. And Atari itself, still riding the tidal wave that was the VCS, showed little obvious interest in improving or promoting this tiny chunk of its business. Then of course there was Radio Shack, but no one — including them — seemed to know just what they were trying to accomplish with a pile of incompatible machines of wildly different specifications and prices all labeled “TRS-80.” And there was the Commodore VIC-20 which had validated for many people the whole category of home computer in the first place. Its price was certainly right, but it was just too limited to have long legs.

The TI-99/4A. Note the prominent port for “Solid State Software” to the right of the keyboard.

The most obvious contender came from an unexpected quarter. Back in early 1980, the electronics giant Texas Instruments had released a microcomputer called the TI-99/4. Built around a CPU of TI’s own design, it was actually the first 16-bit machine to hit the market. It had a lot of potential, but also a lot of flaws and oddities to go with its high price, and went nowhere. Over a year later, in June of 1981, TI tried again with an updated version, the TI-99/4A. The new model had just 16 K of RAM, but TI claimed more was not necessary. Instead of using cassettes or floppy disks, they sold software on cartridges, a technique they called “Solid State Software.” Since the programs would reside in the ROM of the cartridge, they didn’t need to be loaded into RAM; that needed to be used only for the data the programs manipulated. The idea had some real advantages. Programs loaded instantly and reliably, something that couldn’t be said for many other storage techniques, and left the user to fiddle with fragile tapes or disks only to load and save her data files. This just felt more like the way a consumer-electronics device ought to work to many people — no typing arcane commands and then waiting and hoping, just pop a cartridge in and turn the thing on. The TI-99/4A also had spectacularly good graphics, featuring sprites, little objects that were independent of the rest of the screen and could be moved about with very little effort on the part of the computer or its programmer. They were ideal for implementing action games; in a game of Pac-Man, for instance, the title character and each of the ghosts would be implemented as a sprite. Of the other contenders, only the Atari 400 and 800 offered sprites — as well as, tellingly, all of the game consoles. Indeed, they were considered something of a necessity for a really first-rate gaming system.
With these virtues plus a list price of just $525, the TI-99/4A was a major hit right out of the gate, selling in numbers to rival the even cheaper but much less capable VIC-20. It would peak at the end of 1982 with a rather extraordinary (if short-lived) 35 percent market share, and would eventually sell in the neighborhood of 2.5 million units.

With the TI-99/4A so hot that summer of 1982, the one wildcard — the one obstacle to anointing it the king of home computers — was a new machine just about to ship from Commodore. It was called the Commodore 64, and it would change everything. Its story had begun the previous year with a pair of chips.

In January of 1981 some of the engineers at Commodore’s chipmaking subsidiary, MOS Technologies, found themselves without a whole lot to do. The PET line had no major advancements in the immediate offing, and the VIC-20’s design was complete (and already released in Japan, for that matter). Ideally they would have been working on a 16-bit replacement for the 6502, but Jack Tramiel was uninterested in funding such an expensive and complicated project, a choice that stands as amongst the stupidest of a veritable encyclopedia of stupidity written by Commodore management over the company’s chaotic life. With that idea a nonstarter, the engineers hit upon a more modest project: to design a new set of graphics and sound chips that would dramatically exceed the capabilities of the VIC-20 and (ideally) anything else on the market. Al Charpentier would make a graphics chip to be called the VIC-II, the successor to the VIC chip that gave the VIC-20 its name. Bob Yannes would make a sound synthesizer on a chip, the Sound Interface Device (SID). They took the idea to Tramiel, who gave them permission to go ahead, as long as they didn’t spend too much.

In deciding what the VIC-II should be, Charpentier looked at the graphics capabilities of all of the computers and game machines currently available, settling on three as the most impressive, and thus the ones critical to meet or exceed: the Atari 400 and 800, the Mattel Intellivision console, and the soon-to-be-released TI-99/4A. Like all of these machines, the VIC-II chip would have to have sprites. In fact, Charpentier spent the bulk of his time on them, coming up with a very impressive design that allowed up to eight onscreen sprites in multiple colors. (Actually, as with so many features of the VIC-II and the SID, this was only the beginning. Clever programmers would quickly come up with ways to reuse the same sprite objects, thus getting even more moving objects on the screen.) For the display behind the sprites, Charpentier created a variety of character-based and bitmapped modes, with palettes of up to 16 colors at resolutions of up to 320×200. On balance, the final design did indeed exceed or at least match the aggregate capabilities of anything else on the market. It offered fewer colors than the Atari’s 128, for example, but a much better sprite system; fewer total sprites (without trickery) than the TI-99/4A’s 32, but bigger and more colorful ones, and with about the same background display capabilities.

If the VIC-II was an evolutionary step for Commodore, the SID was a revolution in PC and videogame sound. Bob Yannes, just 24 years old, had been fascinated by electronic sound for much of his life, devouring early electronica records like those by Kraftwerk and building simple analog synthesizers from kits in his garage. Hired by MOS right out of university in 1978, he felt as if he had been waiting his entire career for just this project. An amateur musician himself, he was appalled by the sound chips that other engineers thought exceptional, like that in the Atari 400 and 800. From a 1985 IEEE Spectrum article on the making of the Commodore 64:

The major differences between his chip and the typical videogame sound chips, Yannes explained, were its more precise frequency control and its independent envelope generators for shaping the intensity of a sound. “With most of the sound effects in games, there is either full volume or no volume at all. That really makes music impossible. There’s no way to simulate the sound of any instrument even vaguely with that kind of envelope, except maybe an organ.”

Although it is theoretically possible to use the volume controls on other sound chips to shape the envelope of a sound, very few programmers had ever tackled such a complex task. To make sound shaping easy, Yannes put the envelope controls in hardware: one register for each voice to determine how quickly a sound builds up; two to determine the level at which the note is sustained and how fast it reaches that level; and one to determine how fast the note dies away. “It took a long time for people to understand this,” he conceded.

But programmers would come to understand it in the end, and the result would be a whole new dimension to games and computer art. The SID was indeed nothing short of a full-fledged synthesizer on a chip. With three independent voices at its disposal, its capabilities in skilled hands were amazing; the best SID compositions still sound great today. Games had beeped and exploded and occasionally even talked for years. Now, however, the emotional palette game designers had to paint on would expand dramatically. The SID would let them express deep emotions through sound and (especially) music, from stately glory to the pangs of romantic love, from joy to grief.
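The envelope scheme Yannes describes in the quote above is the classic attack/decay/sustain/release (ADSR) shape. As a rough illustration only — the real SID packs these settings into 4-bit register values and uses exponential rather than linear segments — here is a sketch in Python:

```python
def adsr_envelope(t, attack, decay, sustain, release, note_off):
    """Amplitude (0.0 to 1.0) at time t, for a note released at note_off.

    All times are in seconds; `sustain` is a level, not a time.
    Linear segments only -- a simplification of the real hardware.
    """
    if t < attack:                         # ramp up to full volume
        return t / attack
    if t < attack + decay:                 # fall toward the sustain level
        frac = (t - attack) / decay
        return 1.0 - frac * (1.0 - sustain)
    if t < note_off:                       # hold steady while the key is down
        return sustain
    frac = (t - note_off) / release        # die away after release
    return max(0.0, sustain * (1.0 - frac))
```

A beeper of the kind Yannes complains about is the degenerate case: volume 1.0 while the note is on, 0.0 the instant it stops.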

In November of 1981 the MOS engineers brought their two chips, completed at last, to Tramiel to find out what he’d like to do with them. He decided that they should put them into a successor to the VIC-20, to be tentatively titled the VIC-40. In the midst of this discussion, it emerged that the MOS engineers had one more trick up their sleeves: a new variant of the 6502 called the 6510 which offered an easy way to build an 8-bit computer with more than 48 K of RAM by using a technique called bank switching.

Let’s stop here for just a moment to consider why this should have been an issue at all. Both the Zilog Z80 and the MOS 6502 CPUs that predominated among early PCs are 8-bit chips with 16-bit address buses. The latter number is the one that concerns us right now; it means that the CPU is capable of addressing up to 64 K of memory. So why the 48 K restriction? you might be asking. Well, you have to remember that a computer does not only address RAM; there is also the need for ROM. In the 8-bit machines, the ROM usually contains a BASIC-based operating environment along with a few other essentials like the glyphs used to form characters on the screen. All of this usually consumes about 16 K, leaving 48 K of the CPU’s address space to be mapped to RAM. With the arrival of the 48 K Apple II Plus in 1979, the industry largely settled on this as both the practical limit for a Z80- or 6502-based machine and the configuration that marked a really serious, capable PC. There were some outliers, such as Apple’s Language Card that let a II Plus be expanded to 64 K of RAM by dumping BASIC entirely in favor of a Pascal environment loaded from disk, but the 48 K limit was largely accepted as just a fact of life for most applications.
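The arithmetic is simple enough to spell out (in modern Python, purely as an illustration; nothing like this ran on the machines themselves):

```python
# A 16-bit address bus distinguishes 2**16 addresses, one byte each.
address_space = 2 ** 16
print(address_space)        # 65536 bytes, i.e., 64 K

# With roughly 16 K of that space claimed by ROM (BASIC, character
# glyphs, and other essentials), the room left over for RAM is:
rom_kb = 16
ram_kb = address_space // 1024 - rom_kb
print(ram_kb)               # 48 -- the familiar 48 K ceiling
```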

With the 6510, however, the MOS engineers added some circuitry to the 6502 to make it easy to swap pieces of the address space between two (or more) alternatives. Below is an illustration of the memory of the eventual Commodore 64.

Commodore 64 memory map

Ignoring the I/O block as out of scope for this little exercise, let’s walk through this. First we have 1 K of RAM used as a working space to hold temporary values and the like (i.e., the program stack). Then 1 K is devoted to storing the current contents of the screen. Next comes the biggest chunk, 38 K for actual BASIC programs. Then 8 K of ROM, which stores the BASIC language itself. Then comes another 4 K of “high RAM” that’s gotten trapped behind the BASIC ROM; this is normally inaccessible to the BASIC programmer unless she knows some advanced techniques to get at it. Then 4 K of ROM to hold the glyphs for the standard onscreen character set. Finally, 8 K of kernel ROM, storing routines for essential functions like reading the keyboard or interacting with cassette or disk drives. All of this would seem to add up to a 44 K RAM system, with only 40 K of it easily accessible. But notice that each piece of ROM has RAM “underneath” it. Thanks to the special circuitry on the 6510, a programmer can swap RAM for ROM if she likes. Programming in assembly language rather than BASIC? Swap out the BASIC ROM, and get another 8 K of RAM, plus easy, contiguous access to that high block of another 4 K. Working with graphics instead of words, or prefer to define your own font? Swap out the character ROM. Taking over the machine entirely, and thus not making so much use of the built-in kernel routines? Swap the kernel for another 8 K of RAM, and maybe just swap it back in from time to time when you want to actually use something there.
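To make the swapping concrete, here's a toy model in Python. It's a drastic simplification of the real 6510 scheme — which drives the banking from bits of an on-chip I/O port, and can also map I/O chips into the character-ROM region — but it shows the essential trick: the same address reaching different physical memory depending on a few control flags.

```python
# Toy model of Commodore 64 bank switching. The address ranges follow
# the layout described above; the three boolean flags stand in for the
# 6510's banking control bits (simplified -- see above).
BASIC_ROM = range(0xA000, 0xC000)    # 8 K BASIC interpreter
CHAR_ROM = range(0xD000, 0xE000)     # 4 K character glyphs
KERNEL_ROM = range(0xE000, 0x10000)  # 8 K kernel routines

def read_target(addr, basic_in=True, char_in=False, kernel_in=True):
    """Name the physical memory a CPU read of addr reaches."""
    if addr in BASIC_ROM and basic_in:
        return "BASIC ROM"
    if addr in CHAR_ROM and char_in:
        return "character ROM"
    if addr in KERNEL_ROM and kernel_in:
        return "kernel ROM"
    return "RAM"  # the RAM "underneath" shows through

# An assembly programmer banks out BASIC and gains 8 K of RAM at $A000:
print(read_target(0xA000))                  # BASIC ROM
print(read_target(0xA000, basic_in=False))  # RAM
```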

Commodore 64 startup screen

The above will hopefully answer the most common first question of a new Commodore 64 user, past or present: Why does my “64 K RAM system” say it has only 38 K free for BASIC? The rest of the memory is there, but only for those who know how to get at it and who are willing to forgo the conveniences of BASIC. I should emphasize here that the concept of bank switching was hardly an invention of the MOS engineers; it’s a fairly obvious approach, after all. Apple had already used the technique to pack a full 128 K of RAM into a 6502-based computer of their own, the failed Apple III (about which more in the very near future). The Apple III, however, was an expensive machine targeted at businesses and professionals. The Commodore 64 was the first to bring the technique to the ordinary consumer market. Soon it would be everywhere, giving the venerable 6502 and Z80 new leases on life.

Jack Tramiel wasn’t a terribly technical fellow, and likely didn’t entirely understand what an extra 16 K of memory would be good for in the first place. But he knew a marketing coup when he saw one. Thus the specifications of the new machine were set: a 64 K system built around MOS’s three recent innovations — the 6510, the VIC-II, and the SID. The result should be cheap enough to produce that Commodore could sell it for less than $600. Oh, and please have a prototype ready for the January 1982 Winter CES show, less than two months away.

With so little time and such harsh restrictions on production costs, Charpentier, Yannes, and the rest of their team put together the most minimalist design they could to bind those essential components together. They even managed to get enough of it done to have something to show at Winter CES, where the “VIC-40” was greeted with excitement on the show floor but polite skepticism in the press. Commodore, you see, had a well-earned reputation, dating from the days when the PET was the first of the trinity of 1977 to be announced and shown but the last to actually ship, for over-promising at events like these and delivering late or not at all. Yet when Commodore showed the machine again in June at the Summer CES — much more polished, renamed the Commodore 64 to emphasize what Tramiel and Commodore’s marketing department saw as its trump card, and still promised for less than $600 — the press had to start paying major attention. Days later it started shipping. The new machine was virtually indistinguishable from the VIC-20 in external appearance because Commodore hadn’t been willing to spend the time or money to design a new case.

The Commodore 64

The Commodore 64

Inside it was one hell of a machine for the money, although not without its share of flaws that a little more time, money, and attention to detail during the design process could have easily corrected.

The BASIC housed in its ROM (“BASIC 2.0”) was painfully antiquated. It was actually the same BASIC that Tramiel had bought from Microsoft for the original PET back in 1977. Bill Gates, in a rare display of naivete, sold him the software outright for a flat fee of $10,000, figuring Commodore would have to come back soon for another, better version. He obviously didn’t know Jack Tramiel very well. Ironically, Commodore did have on hand a better BASIC 4.0 they had used in some of the later PET models, but Tramiel nixed using it in the Commodore 64 because it would require a more expensive 16 K rather than 8 K of ROM chips to house. People were already getting a lot for their money, he reasoned. Why should they expect a decent BASIC as well? The Commodore 64’s BASIC was not only primitive, but completely lacked commands to actually harness the machine’s groundbreaking audiovisual capabilities; graphics and sound could be accomplished in BASIC only by using “peek” and “poke” commands to access registers and memory locations directly, an extremely awkward, inefficient, and ugly way of programming. If the memory restrictions on BASIC weren’t enough to convince would-be game programmers to learn assembly language, this certainly did. The Commodore 64’s horrendous BASIC likely accelerated an already ongoing flight from the language amongst commercial game developers. For the rest of the 1980s, game development and assembly language would go hand in hand.
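To give a flavor of what that meant in practice, here is a mock-up in Python of the peek-and-poke style. The two addresses are the commonly cited Commodore 64 locations for the border color and the SID master volume; the point is that BASIC 2.0 offered nothing friendlier than writing raw bytes to raw, memorized addresses:

```python
memory = bytearray(65536)  # stand-in for the 64 K address space

def poke(addr, value):
    """BASIC's POKE: write one byte to an absolute address."""
    memory[addr] = value & 0xFF

def peek(addr):
    """BASIC's PEEK: read one byte back from an absolute address."""
    return memory[addr]

# With no COLOR or VOLUME statement in BASIC 2.0, a programmer wrote
# the equivalent of:
poke(53280, 0)   # VIC-II border-color register: black
poke(54296, 15)  # SID master-volume register: maximum
```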

Due to a whole combination of factors — including miscommunication among marketing, engineering, and manufacturing, an ultimately pointless desire to be hardware compatible with the VIC-20, component problems, cost-cutting, and the sheer rush of putting a product together in such a limited time frame — the Commodore 64 ended up saddled with a disk system that would become, even more than the primitive BASIC, the albatross around the platform’s neck. It’s easily the slowest floppy-disk system ever sold commercially, on the order of thirty times slower than Steve Wozniak’s masterpiece, the Apple II’s Disk II system. Interacting with disks from BASIC 2.0, which was written before disk drives existed on PCs, requires almost as much patience as does waiting for a program to load. For instance, you have to type “LOAD ‘$’,8” followed by “LIST” just to get a directory listing. As an added bonus, doing so wipes out any BASIC program you might have happened to have in memory.

The disk system’s flaws frustrate because they dissipate a lot of potential strengths. Commodore had had a unique approach to disk drives ever since producing their first for the PET line circa 1979. A Commodore disk drive is a smart device, containing its own 6502 CPU as well as ROM and 2 K of RAM. The DOS used on other computers like the Apple II to tell the computer how to control the drive, manage the filesystem, etc., is unnecessary on a Commodore machine. The drive can control itself very well, thank you very much; it already knows all about that stuff. This brings some notable advantages. No separate DOS has to be loaded into the computer’s RAM, eating precious memory. DOS 3.3, for example, the standard on the Apple II Plus at the time of the Commodore 64’s introduction, eats up more than 10 K of the machine’s precious 48 K of RAM. Thus the Commodore 64’s memory edge was in practical terms even more significant than it appeared on paper. Because it’s possible to write small programs for the drive’s CPU to process and load them into the drive’s RAM, the whole system was a delight for hackers. One favorite trick was to load a disk-copying program into a pair of drives, then physically disconnect them from the computer. They would continue happily copying disks on their own, as long as the user kept putting more disks in. More practically for average users, it was often possible for games to play music or display animated graphics while simultaneously loading from the drive. Other computers’ CPUs were usually too busy controlling the drive to manage this. Of course, this was a very good feature for this particular computer, because Commodore 64 users would be spending a whole lot more time than users of other computers waiting for their disk drives to load their programs.

Quality-control issues plagued the entire Commodore 64 line, especially in the first couple of years. One early reviewer had to return two machines before Commodore shipped him one that worked; some early shipments to stores were allegedly 80 percent dead on arrival. To go with all of their other problems, the disk drives were particularly unreliable. In one early issue, Compute!’s Gazette magazine stated that four of the seven drives in their offices were currently dead. The poor BASIC and unfriendly operating environment, the atrocious disk system, and the quality-control issues, combined with no option for getting the 80-column display considered essential for word processing and much other business software, kept the Commodore 64 from being considered seriously by most businesses as an alternative to the Apple II or IBM PC. Third-party solutions did address many of the problems. Various improved BASICs were released as plug-in cartridges, and various companies rewrote the systems software to improve transfer speeds by a factor of six or more. But businesses wanted machines that just worked for them out of the box, which Apple and IBM largely gave them while Commodore did not.

None of that mattered much to Commodore, at least for now, because they were soon selling all of the Commodore 64s they could make for use in homes. No, it wasn’t a perfect machine, not even with its low price (and dropping virtually by the month), its luxurious 64 K of memory, its versatile graphics, and its marvelous SID chip. But, like the Sinclair Spectrum that was debuting almost simultaneously in Britain, it was the perfect machine for this historical moment. Also like the Spectrum, it heralded a new era in its home country, where people would play — and make — games in numbers that dwarfed what had come before. For a few brief years, the premier mainstream gaming platform in the United States would be a full-fledged computer rather than a console — the only time, before or since, that that has happened. We’ll talk more about the process that led there next time.

(As you might expect, much of this article is drawn from Brian Bagnall’s essential history of Commodore. The IEEE Spectrum article referenced above was also a gold mine.)

 

Posted on December 17, 2012 in Digital Antiquaria, Interactive Fiction

 


Summer Camp is Over

It’s difficult to exaggerate just what a phenomenon Atari and their VCS console were in the United States of the very early 1980s. The raw figures are astounding; nothing else I’ve written about on this blog holds a candle to Atari’s mainstream cultural impact. By the beginning of 1982 the rest of the business of their parent company, the longstanding media conglomerate Warner Communications, looked almost trivial in comparison. Atari reaped six times as much profit as Warner’s entire music division; five times as much as the film division. By the middle of the year 17 percent of American households owned a videogame console, up from 9 percent at the same time of the previous year. Atari all but owned this exploding market, to the tune of an 80 percent share. The company’s very name had become synonymous with videogames, like Kleenex is for tissue. People didn’t ask whether you played videogames; they asked whether you played Atari. As the company ramped up for the big Christmas season with their home version of the huge arcade hit Pac-Man as well as a licensed adaptation of the blockbuster movie of the year, E.T., they confidently predicted sales increases of 50 percent over the previous Christmas. But then, on December 7, they shocked the business world by revising those estimates radically downward, to perhaps a 10 or 15 percent increase. Granted, plenty of businesses would still love to have growth like that, but the fact remained that Atari for the first time had underperformed. Change was in the air, and everyone could sense it.

Those who had been watching closely and thoughtfully could feel the winds of change already the previous summer, when Atari’s infamously substandard version of Pac-Man sold in massive numbers, though not in quite such massive numbers as the company and their boosters had predicted; when sales of the Mattel Intellivision and the brand-new ColecoVision soared, presumably at the expense of Atari’s aged VCS; when Commodore continued to aggressively market their low-cost home computers as a better alternative to a games console, and continued to be rewarded with huge sales. The big question became what form the post-VCS future of gaming would take, assuming it didn’t just fade away like the Hula Hoop fad to which videogames were so often compared. There were two broad schools of thought, which would each prove to be right and wrong in their own ways. Some thought that the so-called “second generation” consoles, like the ColecoVision, would pick up Atari’s slack and the console videogame industry would continue as strong as ever. Others, however, looked to the PC industry, which VisiCalc and the IBM PC had legitimized even as Commodore was proving that people would buy computers for the home in huge numbers if the price was right. The VIC-20 may have been only modestly more capable than the Atari VCS, but as a proof of concept of sorts it certainly got people’s attention. With prices dropping and new, much more capable machines on the horizon, many analysts cast their lot with the home computer as the real fruition of the craze that the Atari VCS had started. Full-fledged computers could offer so much better, richer experiences than the consoles thanks to their larger memories, their ability to display text, their keyboards, their disk-based storage. The newest computers had much better graphics and sound than their console counterparts to boot.
And of course you could do more than play games with a computer, like write letters or help Junior learn BASIC as a leg-up for a computer world soon to come.

An increasing appreciation of the potential of home computers and computer games by the likes of Wall Street meant big changes for the pioneers I’ve been writing about on this blog. Although most of the signs of these changes would not be readily visible to consumers until the following year, 1982 was the year that Big Capital started flowing into the computer-game (as opposed to the console-centric videogame) industry. Slick new companies like Electronic Arts were founded, and old-media corporations started commissioning software divisions. The old guard of pioneers would have to adapt to the new professionalism or die, a test many — like The Software Exchange, Adventure International, California Pacific, Muse, and Edu-Ware, among dozens of others — would fail. The minority that survived — like On-Line Systems (about to be rechristened Sierra On-Line), Brøderbund, Automated Simulations (about to be rechristened Epyx), Penguin, and Infocom — did so by giving their scruffy hacker bona fides a shave and a haircut, hiring accountants and MBAs and PR firms (thus the name changes), and generally starting to behave like real companies. Thanks to John Williams, who once again was generous enough to share his memories with me, I can write about how this process worked within On-Line Systems in some detail. The story of their transformative 1982, the year that summer camp ended, begins with a venture-capital firm.

TA Associates was founded in 1968 as one of the first of the new breed of VC firms. From the beginning, they were also one of the most savvy, often seeing huge returns on their investments while building a reputation for investing in emerging technologies like recombinant DNA and gene splicing at just the right moment. They were one of the first VC firms to develop an interest in the young PC industry, thanks largely to Jacqueline Morby, a hungry associate who came to TA (and to a career in business) only at age 40 in 1978, after raising her children. While many of her peers rushed to invest in hardware manufacturers like industry darling Apple, Morby stepped back and decided that software was where the action was really going to be. It’s perhaps difficult today to fully appreciate what a brave decision that was. Software was still an uncertain, vaguely (at best) understood idea among businesspeople at the time, as opposed to the concrete world of hardware. “Because it was something you couldn’t see, you couldn’t touch, you couldn’t hold,” she later said to InfoWorld, “it was a frightening thing to many investors.” For her first big software investment, in 1980, Morby backed what would ultimately prove to be the wrong horse: she invested in Digital Research, makers of CP/M, rather than Microsoft. Her record after that, however, would be much better, as she and TA maintained a reputation throughout the 1980s as one of (if not the) major players in software VC. She described her approach in a recent interview:

If you talk to enough entrepreneurs, you quickly figure out which of their ventures are the most promising. First, I would consider the year they were formed. If a company was three years old and employed 100 people, that meant something was going right. Then, after researching what their products did, I’d call them — cold. In those days, nobody called on the presidents of companies to say, “Hi, I’m an investor and I’m interested in you. Might I come out to visit and introduce myself?” But most of the companies said, “Come on out. There’s no harm in talking.” My calling companies led to many, many investments throughout the years.

When you look at the potential of a company, the most important questions to consider are, “How big is its market and how fast is it growing?” If the market is only $100 million, it’s not worth investing. The company can’t get very big. Many engineers never ask these questions. They just like the toys that they’re inventing. So you find lots of companies that are going to grow to $5 million or so in sales, but never more, because the market for their products is not big enough.

By 1982, Morby, now a partner with TA thanks to her earlier software success, had become interested in investing in an entertainment-software company. If computer games were indeed to succeed console games once people grew tired of the limitations of the Atari VCS and its peers, the potential market was going to be absolutely huge. After kicking tires around the industry, including at Brøderbund, she settled on On-Line Systems as just the company for her — unique enough to stand out with its scenic location and California attitude but eager to embrace the latest technologies, crank out hits, and generally take things to the next level.

When someone offers you millions of dollars virtually out of the blue, you’re likely to think that this is all too good to be true. And indeed, venture capital is always a two-edged sword, as many entrepreneurs have learned to their chagrin. TA’s money would come only with a host of strings attached: TA themselves would receive a 24 percent stake in On-Line Systems; Morby and some of her colleagues would sit on the board and have a significant say in the company’s strategic direction. Most of all, everyone would have to clean up their act and start acting like professionals, starting with the man at the top. Steven Levy described Ken Williams in his natural habitat in Hackers:

Ken’s new office was just about buried in junk. One new employee later reported that on first seeing the room, he assumed that someone had neglected to take out a huge, grungy pile of trash. Then he saw Ken at work, and understood. The twenty-eight-year-old executive, wearing his usual faded blue Apple Computer T-shirt and weather-beaten jeans with a hole in the knee, would sit behind the desk and carry on a conversation with employees or people on the phone while going through papers. The T-shirt would ride over Ken’s protruding belly, which was experiencing growth almost as dramatic as his company’s sales figures. Proceeding at lightning pace, he would glance at important contracts and casually throw them in the pile. Authors and suppliers would be on the phone constantly, wondering what had happened to their contracts. Major projects were in motion at On-Line for which contracts hadn’t been signed at all. No one seemed to know which programmer was doing what; in one case two programmers in different parts of the country were working on identical game conversions. Master disks, some without backups, some of them top secret IBM disks, were piled on the floor of Ken’s house, where one of his kids might pick it up or his dog piss on it. No, Ken Williams was not a detail person.

If Ken was not detail-oriented, he did possess a more valuable and unusual trait: the ability to see his own weaknesses. He therefore acceded wholeheartedly to TA’s demands that he hire a squad of polished young managers with suits, resumes, and business degrees. He even let TA field most of the candidates. He hired as president Dick Sunderland, a fellow Ken had worked for before the birth of On-Line, a boss whom the hackers under him had loathed as too pedantic, too predictable, too controlling, too boring. To Ken (and TA) this sounded like just the sober medicine On-Line would need to compete in the changing industry.

Which is not to say that all of this new professionalism didn’t also come with its attendant dangers. John Williams states frankly today that “some of those new managers came in with the idea that they would run the business after they pushed Ken to the side or out.” (It wasn’t clear to the Williamses whether they came up with that idea on their own or TA subtly conveyed it to them during the hiring process.) Ken also clashed constantly with his own hire Sunderland; the latter would be gone again within a year. He was walking a difficult line, trying to instill the structure his company needed to grow and compete and be generally taken seriously by the business community without entirely losing his original vision of a bunch of software artisans creating together in the woods. As org charts started getting stapled to walls, file cabinets started turning up locked, and executive secretaries started appearing as gatekeepers outside the Williamses’ offices, many of the old guard saw that vision as already dying. Some of them left. Needless to say, Ken no longer looked for their replacements in the local liquor store.

Ken proved amazingly adept at taking the good advice his new managers had to offer while remaining firmly in charge. After a while, most accepted that he wasn’t going anywhere and rewarded him with a grudging respect. Much of their advice involved the face that On-Line presented to the outer world. For a long time now everyone had agreed that the name “On-Line Systems,” chosen by Ken back when he had envisioned a systems software company selling a version of FORTRAN for microcomputers, was pretty awful — “generic as could be and dull as dishwater” in John Williams’s words. They decided on the new name of “Sierra On-Line.” The former part conveyed the unique (and carefully cultivated) aura of backwoods artisans that still clung to the company even in these more businesslike days, while the latter served as a bridge to the past as well as providing an appropriate high-tech flourish (in those times “On-Line” still sounded high-tech). They had a snazzy logo featuring a scenic mountain backdrop drawn up, and revised and slicked-up their packaging. The old Hi-Res Adventure line was now SierraVenture; the action games SierraVision.

Sierra hired Barbara Hendra, a prominent New York PR person, to work further on their image. Surprisingly, the erstwhile retiring housewife Roberta was a big force behind this move; her success as a game designer had revealed an unexpected competitive streak and a flair for business of her own. Hendra nagged Roberta and especially Ken — he of the faded, paunch-revealing tee-shirt and the holey jeans — about their dress and mannerisms, teaching them how to interact with the movers and shakers in business and media. She arranged a string of phone interviews and in-person visits from inside and outside the trade press, including a major segment on the prime-time news program NBC Magazine. Ken was good with these junkets, but Roberta — pretty, chic, and charming — was the real star, Sierra’s PR ace in the hole, the antithesis of the nerds so many people still associated with computer games. When someone like Roberta said that computer games were going to be the mass-market entertainment of the future, it somehow sounded more believable than it did coming from a guy like Ken.

In the midst of all this, another windfall all but fell into Sierra’s lap. Christopher Cerf, a longtime associate of The Children’s Television Workshop of Sesame Street fame, approached them with some vague ideas about expanding CTW into software. From there discussions moved in the direction of a new movie being developed by another CTW stalwart: Jim Henson, creator of the Muppets. For Ken, who had been frantically reading up on entertainment and media in order to keep up with the changes happening around his company, the idea of working with Henson was nothing short of flabbergasting, and not just because the Muppets were near the apogee of their popularity on the heels of two hit movies, a long-running television series, and a classic Christmas special with John Denver. John Williams:

Ken developed a kind of worship for two men as he began to study up on entertainment. One was Walt Disney and the second was Jim Henson. Both were men who were enablers — not known as much for their own artistry so much as their ability to bring artists and business together to make really big things happen — and that was what Ken strived for. Walt was already gone of course, but Henson was still alive.

Ken Williams (right) hobnobbing with Jim Henson

The almost-completed movie was called The Dark Crystal. In the works on and off for five years, it marked a major departure for Henson and his associates. Although populated with the expected cast of puppets and costumed figures (and not a single recognizable human), there were no Muppets to be found in it. It was rather a serious — even dark — fantasy tale set in a richly organic landscape of the fantastic conceived by Henson’s creative partner on the project, designer and illustrator Brian Froud. In an early example of convergence culture, Henson and friends were eager to expand the world of the movie beyond the screen. They already planned a glossy hardcover book, a board and a card game, and a traveling art exhibit. Now an adventure game, to be designed by Roberta, sounded like a good idea. Such a major media partnership was a first for a computer-game publisher, although Atari had been doing licensed games for some time now for the VCS. Anyone looking for a sign that computer games were hitting the big time needed look no farther.

The Dark Crystal

For the Williamses, the changes that the venture capitalists had brought were nothing compared to this. Suddenly they were swept into the Hollywood/Manhattan media maelstrom, moving in circles so rarified they’d barely realized they existed outside of their televisions. John Williams again:

I remember this time very well. Let me put it in a very personal perspective. I’m like 22 or 23. A guy who grew up in Wheaton, Illinois (which is just down the street from absolutely nowhere) and currently living in a town of like 5000 people 50 miles from the nearest traffic light. Now imagine this young wet-behind-the-ears punk walking through the subways and streets of Manhattan with Jim Henson, getting interviewed on WNBC talk radio while wearing his first real tailored suit. Eating at “21” with Chris Cerf, and taking limos to meet with publishing companies on Times Square. That was me – and I was just along for the ride. For Ken and Roberta, it was on a whole other level.

Much of the Williamses’ vision for computerized entertainment, of games as the next great storytelling medium to compete with Hollywood, was forged during this period. If they had ever doubted their own vision for Sierra, hobnobbing with the media elite convinced them that this stuff was going to get huge. Years before the technology would become practical, they started toying with the idea of hiring voice actors and considering how Screen Actors Guild contracts would translate to computer games.

But for here and now there was still The Dark Crystal, in the form of both movie and game. Both ended up a bit underwhelming as actual works when set against what they represent to Sierra and the computer-game industry.

The movie is in some ways an extraordinary achievement, a living alien world built from Styrofoam, animatronics, and puppets. It’s at its most compelling when the camera simply lingers over the landscape and its strange inhabitants. Unfortunately, having created this world, Henson and company don’t seem quite sure what to do with it. The story is an unengaging quest narrative which pits an impossibly, blandly good “chosen one,” the Gelfling Jen, against the impossibly evil race of the Skeksis. It’s all rather painfully derivative of The Lord of the Rings: two small protagonists carry an object of great power into danger, with even a Gollum stand-in to dog their steps. Nor do the endless melodramatic voiceovers or the hammy voice acting do the film any favors. It’s a mystery to whom this film, too dark and disturbing for children and too hokey and simplistic for adults and with none of the wit and joy that marked Henson’s Muppets, was meant to really appeal. There have been attempts in recent years to cast the movie, a relative commercial disappointment in its time, as a misunderstood masterpiece. I’m not buying it. The Dark Crystal is no Blade Runner.

The game is similarly difficult to recommend. Like The Hobbit, The Dark Crystal‘s quest narrative maps unusually well to an adventure game, but Roberta showed none of the technical ambition that Veronika Megler displayed in making a game of her source material. The Dark Crystal suffers from the same technical and design flaws that mark all of the Hi-Res Adventure line: absurd puzzles, bad parser, barely-there world model, you’ve heard the litany before from me. In the midst of the problems, however, there are more nods toward story than we’re used to seeing in our old-school adventure games, even if they sometimes smack more of the necessities born of doing a movie adaptation than a genuine striving to advance the medium. Early on we get by far the longest chunk of expository text to make it into any of the Hi-Res Adventure line.

The Dark Crystal

Unusually, the game is played in the third person, with you guiding the actions of the movie’s hero Jen and, later, both Jen and his eventual sidekick/tentative love interest, Kira. The duality of this is just odd; you never quite know who will respond to your commands. The third-person perspective extends to the graphics, which show Jen and Kira as part of each scene.

The Dark Crystal

As Carl Muckenhoupt mentions in his (highly recommended) posts about the game, it’s tempting to see the graphics as a transitional step between the first-person perspective of Roberta’s earlier Hi-Res Adventure games and the fully animated adventure games that she would make next — games that would have you guiding your onscreen avatar about an animated world in real-time. It’s also very possible that working with the fleshed-out story and world of someone else inspired Roberta to push her own future original works further in the direction of real storytelling. Notably, before The Dark Crystal none of her games bothered to define their protagonists or even give them names; after it, all of them did.

Whatever influence it had on Roberta’s design approach, the fact remains that she seemed less passionate about The Dark Crystal itself than she had been about her previous games. With the licensing deal having been finalized as the movie was all but ready for release, The Dark Crystal was what John Williams euphemistically calls a “compressed timeline” game. Roberta spent only a month or so on the design while dealing with all of the distractions of her new life in the spotlight, then turned the whole thing over to Sierra’s team of in-house programmers and artists. It all feels a bit rote. John:

The simple truth is that the whole of the Dark Crystal project was, in the end, a business decision and not really driven by our developers or our creative people. I think that’s really why this is one of the least cared about and least remembered products in the Sierra stable. Look back at that game and there’s really none of Roberta’s imagination in there – and the programmers, artists, etc., involved were basically mimicking someone else’s work and creating someone else’s vision. The lack of passion shows.

The player must not so much do what seems correct for the characters in any given situation as try to recreate the events of the film. If she succeeds, she’s rewarded with… exactly what she already saw in the movie.

The Dark Crystal

Adapting a linear story to an interactive medium is much more difficult than it seems. This is certainly one of the least satisfying ways to approach it. The one nod toward the dynamism that marks The Hobbit is a pair of minions sent by the Skeksis to hunt you down: an intelligent bat and a Garthim, a giant, armored, crab-like creature with fearsome pincers. If you are spotted in the open by the bat, you have a limited amount of time to get under cover — trees, a cave, or the like — before a Garthim comes to do you in. That’s kind of impressive given the aging game engine, and it does help with the mimesis that so many of the game’s other elements actively work against. But alas, it’s just not enough.

Even with the rushed development schedule, the game didn’t arrive in stores until more than a month after the movie’s December 7, 1982, premiere. After, in other words, the big Christmas buying season. That, along with the movie’s lukewarm critical reception and somewhat disappointing performance at the box office, likely contributed to The Dark Crystal not becoming the hit that Sierra had expected. Its sales were disappointing enough to sour Sierra on similar licensing deals for years to come. Ken developed a new motto: “I don’t play hits, I make them.”

Of course, it also would have been unwise to blame The Dark Crystal‘s underperformance entirely on timing or on its being tied to the fate of the movie. The old Hi-Res Adventure engine, which had been so amazing in the heyday of The Wizard and the Princess, was getting creaky with age, and had long since gone past the point of diminishing commercial returns; not only The Dark Crystal but also its immediate predecessor, the epic Time Zone, had failed to meet sales expectations. This seventh Hi-Res Adventure would therefore be the last. Clearly it was time to try something new if Sierra intended to keep their hand in adventure games. That something would prove to be as revolutionary a step as had been Mystery House. The Dark Crystal, meanwhile, sneaked away into history largely unloved and unremembered, one of the first of a long industry tradition of underwhelming, uninspired movie cash-ins. The fact that computer games had reached a stage where such cash-ins could exist is ultimately the most important thing about it.

If you’d like to try The Dark Crystal for yourself despite my criticisms, here are the Apple II disk images and the manual.

(As always, thanks to John Williams for his invaluable memories and insights on these days of yore. In addition to the links embedded in the text, Steven Levy’s Hackers and the old Atari history Zap! were also wonderful sources. Speaking of Atari histories: I look forward to diving into Marty Goldberg and Curt Vendel’s new one.)

 

Posted on December 12, 2012 in Digital Antiquaria, Interactive Fiction

 
