Lisa

In mid-1978 Apple Computer hired Trip Hawkins, a 25-year-old with a newly minted MBA from Stanford, to work in the marketing department. He quickly became a great favorite of Steve Jobs. The two were of similar ages and similar dispositions, good looking and almost blindingly charismatic when they turned on the charm. They were soon confirmed running buddies; Hawkins later told of “smoking dope” in a Vegas hotel room during a CES show, then going down to shoot craps all night. Less superficially, they thought differently than both the technicians and engineers at Apple (like Steve Wozniak) and the older, more conventional businessmen there (like Mike Markkula and Michael Scott). Their fondest dreams were not of bytes or market share, but rather of changing the way people lived through technology. Jobs monopolized much of Hawkins’s time during the latter part of 1978, as the two talked for hours on end about what form a truly paradigm-shifting computer might take. The ideas that began here would retain, through years of chaos to come, the casual code name they initially gave them: “Lisa.”

There’s been some confusion about the origins of the name. Years later, when they began selling Lisa as an actual product, Apple tried to turn it into LISA, short for “Local Integrated Software Architecture.” This was so obviously tortured that even the fawning computer press to whom they first promoted the machine had some fun with it; “Let’s Invent Some Acronyms” was a popular variation. Some sources say the machine was named after “the original hardware engineer’s daughter.” Yet it’s hard to get past the fact that just before all those long conversations with Hawkins, Jobs had a daughter born to an on-again/off-again girlfriend he had known since high school. They named her Lisa Nicole. The story of what followed is confused and not terribly flattering to Jobs personally (not that it’s difficult to find examples of the young Jobs behaving like a jerk). After apparently being present at the birth and helping to name the baby, not to mention naming his new pet project after her, Jobs had a sudden change of heart. He denied paternity vigorously; when asked who he thought the real father was, he charmingly said that it could be any of about 28 percent of the male population of the country. Even when a court-ordered paternity test gave him about a 95 percent chance of being the father, he continued to deny it, claiming to be sterile. A few years later, when Jobs was worth some $210 million, he was still cutting a check each month to Lisa’s mother for exactly the amount the court had ordered: $385. Only slowly and begrudgingly would he accept his daughter in the years to come. At the end of that process he finally acknowledged the origin of the computer’s name: “Obviously it was named for my daughter,” he told his official biographer Walter Isaacson shortly before his death.
The “original hardware engineer” apparently referenced above, Ken Rothmuller, was even more blunt: “Jobs is such an egomaniac, do you really think he would have allowed such an important project to be named after anybody but his own child?”

Jobs and Hawkins were convinced that the Lisa needed to be not just more powerful than the likes of the Apple II, but also — and here is the real key — much easier, much more fun for ordinary people to use. They imagined a machine that would replace esoteric key presses and cryptic command prompts with a set of simple on-screen menus that would guide the user through her work every step of the way. They conceived not just a piece of hardware waiting for outside programmers to make something of it, like the Apple II, but an integrated software/hardware package, a complete computing environment tailored to friendliness and ease of use. Indeed, the software would be the real key.

But of course making software more friendly would put unprecedented demands on the hardware. This was as true then as it is today. As almost any programmer will tell you, the amount of work that goes into a program and the amount of computing horsepower it demands are directly proportional to how effortlessly it seems to operate from the user’s perspective. Clearly Jobs and Hawkins’s ideas for Lisa were beyond the capabilities of the little 6502 inside the Apple II. Yet among the other options available at the time only Intel’s new 16-bit 8086 looked like it might have the power to do the job. Unfortunately, Apple and their engineers disliked Intel and its architecture passionately, rendering that a non-option. (Generally computer makers have broken down into two camps: those who use Intel chips and derivatives such as the Zilog Z80, and those who use other chips. Until fairly recently, Apple was always firmly in the latter camp.) In the spring of 1979, with the Apple II Plus finished and with most of the other engineers occupied getting the Sara project (eventually to be known as the Apple III) off the ground, Woz therefore set to work on one hell of an ambitious project. He would make a brand new CPU in-house for Lisa, using a technique he had always favored called bit slicing.

Up to this time Lisa had had little official existence within Apple. It was just a ground for conjecture and dreaming by Jobs and his closest circle. But on July 30, 1979, it took official form at last, when Ken Rothmuller, like Woz late of nearby Hewlett-Packard, came on board to lead the project under the close eye of Jobs, who divided his time between Sara and Lisa. Sara was now envisioned as the immediate successor to the Apple II, a much improved version of the same basic technology; Lisa as the proverbial paradigm shift in computing that would come somewhat later. Most of the people whom Rothmuller hired were also HP alumni, as were many of those working on Sara; Apple in these days could seem like almost a divisional office of the larger company. This caused no small chagrin to Jobs, who considered the HP engineers the worst sort of unimaginative, plodding “Clydesdales,” but it was inevitable given Apple’s proximity to the giant.

While they waited for Woz’s new processor, the Lisa people started prototyping software on the Apple II. Already a bit-mapped display that could show images and various font styles was considered essential. The early software thus ran through a custom-built display board connected to the Apple II, running at a resolution of 356 × 720. At this stage the interface was to be built around “soft keys.” Each application would always show a menu of functions that were mapped to a set of programmable function keys on the keyboard. It was a problematic approach, wasteful of precious screen real estate and limited by the number of function keys on the keyboard, but it was the best anyone had yet come up with.

The original Lisa user interface, as of October 8, 1979. Note the menu of “soft keys” at the bottom.

That October Rothmuller’s team assembled the first working Lisa computer around a prototype of Woz’s processor. Just as they were doing so, however, they became aware of an alternative that would let them avoid the trouble and expense of refining a whole new processor and also avoid dealing with the idiosyncrasies of Woz, who was quickly falling out of favor with management. That alternative would have a tremendous impact on computing during the decade to come: the Motorola 68000.

The 68000 must have seemed like the answer to a prayer. At a time when the first practical 16-bit chips like the Intel 8086 were just making their presence felt, the 68000 went one better. Externally, it fetched and stored from memory like a 16-bit chip, but it could perform many internal operations as a 32-bit chip, while a 24-bit address bus let it address a full 16 MB of memory, a truly mind-boggling quantity in 1979. It could be clocked at up to 8 MHz, and had a beautiful system of interrupts built in that made it ideal as the centerpiece of a sophisticated operating system like those that had heretofore only been seen on the big institutional computers. In short, it was simply the sleekest, sexiest, most modern microprocessor available. Naturally, Apple wanted it for the Lisa. Motorola was still tooling up to produce the chips — they wouldn’t begin coming out in quantity until the end of 1980 — but Apple was able to finagle a deal that gave them access to prototypes and pre-release samples. Woz’s processor was put on the shelf. The Lisa was now to be a 68000 machine, the CPU of the future housed in the computing paradigm of the future. It’s at this stage, with the Lisa envisioned as a soft-key-driven 68000-based computer, that Xerox PARC enters the picture.

The story of Steve Jobs’s visit to PARC in December of 1979 has passed into computer lore as a fateful instant when everything changed, one to stand alongside IBM’s visit to Digital Research the following year. Depending on your opinion of Jobs and Apple, they would either go on to refine, implement, and popularize these brilliant ideas about which Xerox themselves were clueless, or shamelessly rip off the work of others — and then have the hypocrisy to sue still others for trying to do the same, via their “look and feel” battle with Microsoft over the Windows interface. In truth, the PARC visit was in at least some ways less momentous than conventional wisdom would have it. To begin with, the events that set the meeting in motion had little to do with the future of computing as implemented through the Lisa project or anywhere else, and a lot to do with a pressing, immediate worry afflicting Mike Markkula and Michael Scott.

Back in early 1978, Apple had been the first PC maker to produce a disk system, using the new technology of the 5 1/4″ floppy disk which had been developed by a company called Shugart Associates. Woz’s Disk II system was as important to the Apple II’s success as the Apple II itself, a night-and-day upgrade over the old slow and balky cassette tapes that enabled, amongst many other advances, the Apple II’s first killer app, VisiCalc. Apple initially bought its drives direct from Shugart, the only possible source. However, they soon became frustrated with the prices they were paying (apparently Apple’s legendarily high profit margins were justifiable for them, but not for others) and with the pace of delivery. They therefore found a cut-rate electronics manufacturer in Japan, Alps Electric Company, whom they helped to clone the Shugart drives. Through Alps they were able to get all the drives they wanted, and get them much cheaper than they could through Shugart. Trouble was, blatantly cloning Shugart’s patented technology in this way left them vulnerable to all sorts of legal action. By this time, Apple had a reputation as an up-and-coming company to watch, and was raising money toward an eventual IPO from a variety of investors. When he heard that Xerox’s financial people were interested in making an investment, Scott suddenly saw a way to protect the company from Shugart. Shugart, you see, was wholly owned by Xerox. Scott reasoned, correctly, that Xerox would not allow one of its subsidiaries to sue a company in which it had a vested interest. Apple and Xerox quickly concluded an agreement that allowed the latter to buy 100,000 shares of the former for a rather paltry $1 million. As a sort of adjunct, the two companies also agreed to do some limited technology exchange. It was this that led to Jobs’s legendary visit to PARC some months later.

The fact that it took him so long to finally visit shows that PARC’s technology was not so high on Jobs’s list of priorities. The basics of what PARC had to offer were hardly any big secret amongst people who thought about such things during the 1970s. It was something of a rite of passage for ambitious computer-science graduate students, at least those from the nearby universities, to take a tour and get a glimpse of what everyone was increasingly coming to regard as the interface of the future. Several people at Apple and even on the Lisa team were very aware of PARC’s work. Many of PARC’s ideas had already made their way into the Lisa. Reports vary somewhat, but some claim that the Lisa already had the concept of windowing and even an optional mouse before the visit to PARC. And certainly the Alto’s bitmapped display model was already present. The Lisa team member who finally convinced Jobs to visit PARC, Bill Atkinson, later claimed to wish he had never done so: “Those one and a half hours tainted everything we did, and so much of what we did was original research.”

The legendary visit to PARC was actually two visits, which took place a few days apart. The first involved a small group, perhaps no more than Atkinson and Jobs themselves. The second was a much lengthier and more in-depth demonstration that spanned most of a day, and involved most of the principal players on Lisa, including Hawkins. As Jobs later freely admitted, he saw three breakthrough technologies at PARC — the GUI, the LAN, and object-oriented programming in the form of Smalltalk — but was so taken by the first that he hardly noticed the other two. Jobs was never particularly interested in how technology was constructed, so his lack of engagement with the third is perhaps understandable. His inability to get the importance of networking, however, would become a major problem for Apple in the future. (A fun anecdote has the Jobs of a few years later, tired of being bothered about Apple’s piss-poor networking, throwing a floppy disk at his interlocutor, saying, “Here’s your fucking network!”)

Even if there weren’t as many outright revelations at PARC as legend would have it, it’s certainly true that Jobs and the rest of the Lisa team found what they saw there hugely inspiring. Suddenly all of these ideas that they had been discussing in the abstract were there, before them, realized in actual pixels. PARC showed them that it could be done. As Hawkins later put it, “We got religion.”

Of course, every religion needs a sacred text. Hawkins provided one in the spring of 1980 when he finished the 75-page “Lisa Marketing Requirements.” Far more than what its name would imply, it was in fact a blueprint for the entire project as Jobs and Hawkins now envisioned it. It’s a fascinating read today. Lisa will “portray a friendlier ‘personality’ and ‘disposition’ than ordinary computers to allow first-time users to develop the same emotional attachment with their system that they have with their car or their stereo.” Over and over occurs a phrase that was supposed to be the mission statement for PARC: “the office of the future.” Other parts, however, put the lie to the notion that Apple decided to just junk everything it had already done on the Lisa and clone the Xerox Alto. While a mouse is now considered essential, for instance, they are still holding onto the old notion of a custom keyboard with soft keys. The MR document was Hawkins’s last major contribution to Lisa. Soon after writing it, he became Apple’s head of marketing, limiting his role with Lisa.

While the HP contingent remained strongly represented, as the team grew Apple began poaching from PARC itself, eventually hiring more than fifteen ex-PARCers. Those who weren’t on board with the new, increasingly PARC-centric direction found it tough going. Rothmuller, for instance, was unceremoniously dumped for thinking too traditionally. And then, unexpectedly, Jobs himself was gone.

As 1980 drew to a close, with the IPO looming and the Apple III already starting to show signs of becoming a fiasco, CEO Michael Scott decided that he had to impose some order on the company and neuter Jobs, whose often atrocious treatment of others was bringing a steady rain of complaints down upon his desk. He therefore reorganized the entire company along much stricter operational lines. Jobs begged for the newly created Lisa division, but Scott was having none of it. Lisa after all was coming more and more to represent the long-term future of Apple, and after watching the results of his meddling in the Sara project Scott had decided that he didn’t want Jobs anywhere near it. If Jobs would just confine himself to joining Woz as Apple’s token spokesman and mascot, that would be more than enough of a contribution, thank you very much. He placed Lisa in the hands of yet another steady ex-HP man, John Couch. Jobs went off in a huff, eventually to busy himself with another project called Macintosh. From now on he would be at war with his erstwhile pet. One of his first strikes was to lure away Atkinson, an ace graphics programmer, to the Macintosh project.

By now 68000-based prototype machines were available and software development was ramping up. Wayne Rosing was now in charge of hardware; Bruce Daniels, who had co-written the original MIT Zork and written Infocom’s first interpreter for the Apple II, in charge of the operating system; and Larry Tesler, late of PARC, in charge of the six integrated applications to be at the heart of the office of the future. They were: Lisa Draw, a drawing program; Lisa Write, a what-you-see-is-what-you-get word processor in the tradition of PARC’s Gypsy; Lisa Project, a project manager; Lisa Calc, a spreadsheet; Lisa List, a database; and Lisa Graph, a business-graphing package. From a very early date the team took the then-unusual step of getting constant feedback on the interface from ordinary people. When the time came to conduct another round of testing, they would go to Apple’s Human Resources department and request a few new hires from the clerical pool or the like who had not yet been exposed to Lisa. Tesler:

We had a couple of real beauties where the users couldn’t use any of the versions that were given to them and they would immediately say, “Why don’t you just do it this way?” and that was obviously the way to do it. So sometimes we got the ideas from our user tests, and as soon as we heard the idea we all thought, “Why didn’t we think of that?” Then we did it that way.

It’s difficult to state strongly enough what a revolutionary change this made from the way that software had been developed before, in which a programmer’s notion of utilitarian functionality was preeminent. It was through this process that the team’s most obvious addition to the PARC template arose: pull-down menus. User testing also led them to decide to include just one button on the mouse, in contrast to the PARC mice which had three or more. While additional buttons could be useful for advanced users, new users found them intimidating. Thus was born Apple’s stubborn commitment to the single-button mouse, which persisted for more than twenty years. The final major piece of the user-interface puzzle, of the GUI blueprint which we still know today, came in June of 1981 when the team first saw the Xerox Star. The desktop metaphor was so obviously right for the office of the future that they immediately adopted it. Thus the Lisa in its final form was an amalgam of ideas taken from PARC and from the Star, but also represents significant original research.

As 1982 began, the picture of what Lisa should be was largely complete. Now it just came down to finishing everything. As the year wore on, the milestones piled up. In February the system clipboard went in, allowing the user to cut and paste not just between the six bundled applications but presumably any that might be written in the future — a major part of the Lisa vision of a unified, consistent computing environment. On July 30, 1982, the team started all six applications at once on a single computer to test the capabilities of the advanced, multitasking operating system. On September 1, the Lisa was declared feature complete; all that remained now was to swat the bugs. On October 10, it was demonstrated to Apple’s sales force for the first time.

The Apple Lisa. Note the two Twiggy drives to the right. The 5 MB hard drive sits on top.

John Couch’s people had a lot to show off. The Lisa’s hardware was quite impressive, with its high-resolution bitmapped display, its mouse, and its astonishing full 1 MB of memory. (To understand just how huge that number was in 1982, consider that the IBM PC had not been designed to even support more than 640 K, a figure IBM regarded as a strictly theoretical upper limit no one was ever likely to reach in the real world.) Yet it was the software that was the most impressive part. To use an overworked phrase that in this case is actually deserved, Lisa OS was years ahead of its time. Aside from the hacker-oriented OS-9, it was the first PC operating system to support multitasking. If the user started up enough programs to exceed even the machine’s 1 MB of memory, a virtual-memory scheme kicked in to cache the excess onto the 5 MB hard drive. (By way of comparison, consider that this level of sophistication would not come to a Microsoft operating system until Windows 3.0, released in 1990.) It was possible to cut and paste data between applications effortlessly using the system clipboard. With its suite of sophisticated what-you-see-is-what-you-get applications that benefited greatly from all that end-user testing and a GUI desktop that went beyond even what had been seen on the Star (and arguably beyond anything that would be seen for the rest of the 1980s) in consistency and usability, the Lisa was kind of amazing. Apple confidently expected it to change the world, or at least to remake the face of computing, and in this case their hubris seemed justified.

Apple officially announced the Lisa on January 19, 1983, alongside the Apple IIe in an event it labeled “Evolution/Revolution.” (I trust you can guess which was which.) They managed to convince a grudging Jobs, still the face of the company, to present these two machines that he ardently hated in his heart of hearts. It must have especially stung because the introduction was essentially a celebration of the bet he was about to lose with Couch: that he could get his Macintosh out before the Lisa. Jobs had come to hate everything about the Lisa project since his dismissal. He saw the Lisa team, now over 200 people strong when the business and marketing arms were taken into account, as bloated and coddled, full of the sort of conservative, lazy HP plodders he loathed. That loathing extended to Couch himself, whose low-key style of “management by walking around” and whose insistence that his people work sane hours and be given time for a life outside of Apple contrasted markedly with the more high-strung Jobs.

But then, Jobs had much to be unhappy about at this point. Time magazine had planned to make him its “Man of the Year” for 1982, until their journalists, digging around for material for the feature, unearthed a series of rather unflattering revelations about Jobs’s personal life, his chequered, contentious career at Apple, and the hatred many even in his own company felt toward him. Prominent among the revelations were the first reports of the existence of Jobs’s daughter Lisa and Jobs’s shabby treatment of her and her mother. In the face of all this, Time turned the Jobs feature into an elegy for a brilliant young man corrupted and isolated from his erstwhile friends by money and fame. (Those who had known Jobs before his “corruption” mostly just shrugged at such a Shakespearean portrayal and said, well, he’d always kind of been an asshole.) The Man of the Year feature, meanwhile, became the computer itself — a weird sort of “man,” but what was the alternative? Who else symbolized the face of the computer age to mainstream America better than Jobs? This snub rankled Jobs greatly. It didn’t make Apple none too happy either, as now their new wonder-computer was hopelessly entangled with the tawdry details of Jobs’s personal life. They had discussed changing the name many times, to something like the Apple IV or — this was Trip Hawkins’s suggestion — the Apple One. But they had ended up keeping “Lisa” because it was catchy, friendly, and maybe even a little bit sexy, and separated the new machine clearly from both the Apple III fiasco and everything else that had come before from Apple. Now they wished they could change it, but, with advertising already printed and announcements made, there was nothing to be done. It was the first ominous sign of a launch that would end up going nothing like they had hoped and planned.

Still, as time rolled on toward June 1983, when the Lisa would actually start shipping, everything seemed to be going swimmingly. Helped along by ecstatic reviews that rightly saw the Lisa as a potentially revolutionary machine, Apple’s stock soared to $55 on the eve of the first shipments, up from $29 at the time of the Evolution/Revolution events. Partly this was down to the unexpectedly strong sales of the Apple IIe, which unlike the Lisa had gone into production immediately after its announcement, but mostly it was all about the sexier Lisa. Apple already had 12,000 orders in the queue before the first machine shipped.

But then, with the Lisa actually shipping at last, the orders suddenly stopped coming. Worse, many of those that had been already placed were cancelled or returned. Within the echo chamber inside Apple, Lisa had looked like a surefire winner, but that perception had depended upon ignoring a lot of serious problems with the computer itself, not to mention some harsh marketplace realities, in favor of the Lisa’s revolutionary qualities. Now the problems all started becoming clear.

Granted, some of the criticisms that now surfaced were hilariously off-base in light of a future that would prove the Lisa right about so many fundamentals. As always, some people just didn’t get what Lisa was about, were just too mired in the conventional wisdom. From a contemporary issue of Byte:

The mouse itself seems pointless; why replace a device the executive is afraid of (the keyboard) with another unfamiliar device? If Apple was seriously interested in the psychology involved it would have given said executive a light pen.

While the desktop-with-icons metaphor may be useful, were I a Fortune 500 company vice-president, I would be mortally insulted that a designer felt my computer had to show me a picture of a wastebasket to direct me to the delete-file function. Such offensive condescension shows up throughout the design, even in the hardware (e.g., labeling the disk release button “Disk Request”).

I’d hoped (apparently in vain) that Apple finally understood how badly its cutesy, whimsical image hurts its chances of executive-suite penetration. This image crops up in too many ways on the Lisa: the Apple (control) key, the mouse, and on and on. Please, guys, the next time you’re in the executive-suite waiting room, flip through the magazines on the table. You’ll find Fortune, Barron’s, Forbes, etc., but certainly not Nibble. There’s a lesson there.

Other criticisms, however, struck much closer to home. There was one in particular that came to virtually everyone’s lips as soon as they sat down in front of a Lisa: it was slow. No matter how beautiful and friendly this new interface might look, actually using it required accepting windows that jerked reluctantly into place seconds after pulling on them with the mouse, a word processor that a moderately skilled typist could outrace by lines at a time, menus that drew themselves line by laborious line while you sat waiting and wondering if you were ever going to be able to just get this letter finished. Poor performance had been the dirty little secret plaguing GUI implementations for years. Certainly it had been no different on the Alto. One PARC staffer estimated that the Alto’s overall speed would have to be improved by a factor of ten for it to be a viable commercial product outside the friendly confines of PARC and its ultra-patient researchers. Apple only compounded the problem with a hardware design that was surprisingly clunky in one of its most vital areas. Bizarrely, on a machine that was ultimately going to be noticed primarily for its display, they decided against adding any specialized chips to help generate said display, choosing instead to dump the entire burden onto the 68000. Apple would not even have needed to design its own custom display chip, a task that would have been difficult without the resources of, say, Commodore’s MOS Technologies subsidiary. Off-the-shelf solutions, like the NEC 7220, were available, but Apple chose not to avail themselves of them. To compound the problem still further, they throttled the Lisa’s 68000 back to 5 MHz from its maximum of 8 MHz to keep it in sync with the screen refreshes it needed to constantly perform. With the 68000 so overloaded and strangled, the Lisa could seem almost as unusably slow as the old Alto at many tasks.

Other problems that should have been obvious before the Lisa was released also cropped up. The machine used a new kind of floppy disk drive that Apple had been struggling with in-house since all the way back in 1978. Known as Twiggy informally, the disks had the same external dimensions as the industry-standard 5 1/4″ disks, but were of a new design that allowed greater capacity, speed, and (theoretically) reliability. Trouble was, the custom disks were expensive and hard to find (after all, only the Lisa used them), and the whole system never worked properly, requiring constant servicing. The fact that they were on the Lisa at all made little sense in light of the new 3.5″ “micro-floppy” standard just introduced by Sony. Those disks were reliable, inexpensive, and easily available, everything Twiggy was not, while matching or exceeding all of Twiggy’s other specifications. They were in fact so good that they would remain a computer-industry staple for the next twenty years. But Apple had poured millions into the Twiggy boondoggle during the previous several years of constant internal confusion, and they were determined to use it.

And then there was the price. Trip Hawkins’s marketing-requirements document from back in 1980 had presciently warned that the Lisa must be priced at less than $5000 to have a chance of making an impact. Somewhere along the way, however, that bit of wisdom had been lost. The Lisa debuted at no less than $10,000, a figure that in 1983 dollars could buy you a pretty nice new car. Given its extreme price and the resulting necessity that it be marketed exclusively to big corporate customers, it’s tough to say whether the Lisa can really be considered a PC in the mold of the Apple II and IBM PC at all. It utterly lacked the democratic hobbyist spirit that had made the Apple II such a success. Software could be written for the Lisa only by yoking two Lisas together, one to run the program under development and the other to serve as the development machine, with the aid of an expensive toolkit available only from Apple. It was a barrier to entry so high that the Lisa was practically a closed system like the Xerox Star, confined to running only the software that Apple provided. Indeed, had the Lisa come from a company not known exclusively as a PC maker — like, say, Xerox — it might well have been taken by the trade press as a workstation computer or an “office information system” in the vein of the Star. Yet the Lisa also came up short in several key areas in comparison to the only marginally more expensive Star. It lacked the Star’s networking support, meaning that a key element of PARC’s office of the future was missing. And it lacked a laser printer; in its stead Apple offered a dot-matrix model it had jointly developed with C. Itoh. Like too much else about the Lisa, it turned out to be slow, clunky, and unreliable; documents on paper were always a disappointment after viewing them on the Lisa’s crisp screen. Any office manager willing to spend the cash for the Lisa might very well have been better off splashing out some extra for the Star (not that many people were buying either system).

Finally, there was the Macintosh problem. Thanks to their internal confusion and the engine of chaos that was Steve Jobs, Apple had two 68000-based computers sporting mice, GUI-based operating systems, and high-resolution bitmapped monochrome displays. Best of all, the two computers were completely incompatible with each other. Seeing his opportunity, Jobs started leaking like a sieve about Apple’s next computer even as he dutifully demonstrated the Lisa. Virtually every preview or review thus concluded with a mention of the rumors about something called “Mackintosh,” which promised to do just about everything Lisa did for a fraction of the price. Apple’s worst enemy could hardly have come up with a better scheme to squelch the Lisa’s sales.

The rest of the Lisa story is largely that of an increasingly desperate Apple struggling to breathe life back into her. In September they dropped the price to $8200, or $7000 for just the machine and the opportunity to order the applications à la carte rather than as a mandatory bundle. By now Apple’s shares had dropped back to $27, below where they had started the year. At year’s end they had sold just 15,000 Lisas, down from estimates of 50,000 in those heady days of June.

Lisa 2. The Twiggy drives have been replaced by a single 3.5″ drive, and the hard drive is now internal.

In January of 1984 Apple released a much-needed revised model, the Lisa 2, which replaced the Twiggy drives with 3.5″ models. Prices now ranged from $4000 to $5500. But the Macintosh, finally released that same month, well and truly stole the Lisa’s thunder yet again. The last of the Lisas were repackaged with a layer of emulation software as the Macintosh XL in January of 1985, marking the end of the Lisa nameplate. Sales actually picked up considerably after this move, as the price dropped again and the XL was still more advanced in many ways than the current “real” Macintosh. Still, the XL marked the end of the line for Lisa technology; it was officially discontinued on April 29, 1985, just under two years after the first Lisa had rolled off the production line. In the end Apple sold no more than 60,000 Lisas and Macintosh XLs in total.

The Lisa was in many ways half-baked, and its commercial fate, at least in hindsight, is perfectly understandable. Yet its soft power was immense. It showed that a sophisticated, multitasking operating system could be done on a microcomputer, as could a full GUI. The latter achievement in particular would have immediate repercussions. While it would still be years before most average users would have machines built entirely around the PARC/Lisa model of computing, there was much about the Lisa that was implementable even on the modest 8-bit machines that would remain the norm in homes for years to come. Lisa showed that software could be more visual, easier to use, friendlier even on those machines. That new attitude would begin to take root, and nowhere more so than in the ostensible main subject of this blog which I’ve been neglecting horribly lately: games. We’ll begin to see how the Lisa way trickled down to the masses in my next article, which I promise will be about games again at last.

On December 14, 1989, Xerox finally got around to suing Apple for allegedly ripping off their PARC innovations, thus prompting the joke that Xerox can’t even sue you on time. With the cat so well and thoroughly out of the bag by this point, the suit was dismissed a few months later.

(As with most aspects of Apple history, there’s enough material available on the Lisa project in print and on the Internet for days of reading. A particularly fascinating source, because it consists entirely of primary-source documents, is the Lisa directory on Asimov.net.)


Xerox PARC

One day in 1962 J.C.R. Licklider, head of the Defense Department’s Information Processing Techniques Office and future Infocom co-founder, ran into a young man named Robert Taylor at an intra-government conference on emerging computer technologies. Lick was seventeen years older than Taylor, but the two found they had much in common. Both had studied psychology at university, with a special research interest in the psychology of human hearing. Both had moved on to computers, to become what we might today call user-interface specialists, studying the ways that the human mind receives and processes information and how computers might be designed to work in more intuitive accord with their masters. Both were humanists, more concerned with that amorphous thing we’ve come to call the user experience than with the bits and bytes that obsessed the technicians and engineers around them. And both were also from the South — Lick from Missouri, Taylor from Texas — and spoke in a corresponding slow drawl that often caused new acquaintances to underestimate them. A friendship was formed.

Robert Taylor, as photographed by Annie Leibovitz in 1972 for a Rolling Stone feature article

Taylor was working at the time at NASA, having been hired to work on simulators in the big build-up that followed President Kennedy’s Moon-before-the-decade-is-out speech. Rather astonishingly considering the excitement building around the drive to the Moon, Taylor found himself increasingly dissatisfied there. He wasn’t content working on the margins of even as magnificent an endeavor as this one. Fueled by his conversations with Lick about the potential of computers, he wanted to be at the heart of the action. In 1964 he got his wish. Stepping down as head of IPTO, Lick recommended that Ivan Sutherland be made his replacement, and that Taylor be made Sutherland’s immediate deputy. Barely two years later Sutherland himself stepped down, making the 34-year-old Taylor head of the most far-reaching, well-funded computer-research grant program in the country.

By this time Taylor had long ago come to share Lick’s dream of computers as more than just calculating and tabulating machines. They had the potential to become personal, interactive companions that would not replace human brainpower (as many of the strong AI proponents dreamed) but rather complement, magnify, and transmit it. Taylor put his finger on the distinction in a later interview: “I was never interested in the computer as a mathematical device, but as a communications device.” He and Lick together published a manifesto of sorts in 1968 that still stands as a landmark in the field of computer science, the appropriately named “The Computer as a Communications Device.” They meant that literally as well as figuratively: it was Taylor who initiated the program that would lead to the ARPANET, the predecessor to the modern Internet.

The first mouse, created at SRI circa 1964

One of Taylor’s favorites amongst his stable of researchers became Doug Engelbart, who seemed almost uniquely capable of realizing his and Lick’s concept of a new computing paradigm in actual hardware. While developing an early full-screen text editor at the Stanford Research Institute, Engelbart found that users complained of how laborious it was to slowly move the cursor around the screen using arrow keys. To make it easier, he and his team hollowed out a small block of wood, mounting two mechanical wheels attached to potentiometers in the bottom and a button on top. They named it a “mouse,” because with the cord trailing out of the back to connect it with a terminal that’s sort of what it looked like. The strange, homemade-looking gadget was crude and awkward compared to what we use today. Nevertheless, his users found it a great improvement over the keyboard alone. The mouse was just one of many innovations of Engelbart and his team. Their work climaxed in a bravura public demonstration in December of 1968 which first exposed the public to not only the mouse but also the concepts of networked communication, multimedia, and even the core ideas behind what would become known as hypertext. Engelbart pulled out all the stops to put on a great show, and was rewarded with a standing ovation.

But to some extent by this time, and certainly by the time the ARPANET first went live the following year, the atmosphere at ARPA itself was changing. Whereas earlier Taylor was largely left to invest his resources in whatever seemed to him useful and important, the ever-escalating Vietnam War was bringing with it both tightening research budgets and demands that all research be “mission-focused” — i.e., tailored not only to a specific objective, but to a specific military objective at that. Further, Taylor found himself more and more ambivalent about both the war itself and the idea of working for the vast engine that was waging it. After being required to visit Vietnam personally several times to sort out IT logistics there, he decided he’d had enough. He resigned from ARPA at the end of 1969, accepting a position with the University of Utah, which was conducting pioneering (and blessedly civilian) research in computer graphics.

He was still there a year later when an old colleague whose research had been partially funded through ARPA, George Pake, called him. Pake now worked for Xerox Corporation, who were in the process of opening a new blue-sky research facility that he would head. One half of its staff and resources would be dedicated to Xerox’s traditional forte, filled with chemists and physicists doing materials research into photocopying technology. The other half, however, would be dedicated to computer technology in a bid to make Xerox not just the copy-machine guys but the holistic architects of “the office of the future.” Eager to exploit Taylor’s old ARPA connections, which placed him on a first-name basis with virtually every prominent computer scientist in the country, Pake offered Taylor a job as an “associate manager” — more specifically, as a sort of recruiter-in-chief and visionary-in-residence — in the new facility in Palo Alto, California, just outside Stanford University. Bored already by Mormon-dominated Salt Lake City, Taylor quickly accepted.

The very idea of a facility like Xerox’s Palo Alto Research Center feels anachronistic today, what with its open-ended goals and dedication to “pure” research. When hired to run the place, Pake frankly told Xerox that they shouldn’t expect any benefits from the research that would go on there for five to ten years. In that he wasn’t entirely correct, for PARC did do one thing for Xerox immediately: it gave them bragging rights.

Xerox was a hugely profitable company circa 1970, but relatively new to the big stage. Founded back in 1906 as The Haloid Photographic Company, they had really hit the big time only in 1960, when they started shipping the first copy machine practical for the everyday office, the Xerox 914. Now they were eager to expand their empire beyond copy machines, to rival older giants like IBM and AT&T. One part of doing so must be to have a cutting-edge research facility of their own, like IBM’s Thomas J. Watson Research Center and the fabled Bell Labs. Palo Alto was chosen as the location not so much because it was in the heart of Silicon Valley as because it was a long way from the majority of Xerox’s facilities on the other coast. Like its inspirations, PARC was to be kept separate from any whiff of corporate group-think or practical business concerns.

Once installed at PARC, Taylor started going through his address book to staff the place. In a sense it was the perfect moment to be opening such a facility. The American economy was slowing, leaving fewer private companies with the spare change to fund the sort of expensive and uncertain pure research that PARC was planning. Meanwhile government funding for basic computer-science research was also drying up, due to budget squeezes and Congressional demands that every project funded by ARPA must have a specific, targeted military objective. The salad days of Taylor’s ARPA reign, in other words, were well and truly over. It all added up to a buyer’s market for PARC. Taylor had his pick of a large litter of well-credentialed thinkers and engineers who were suddenly having a much harder time finding interesting gigs. Somewhat under the radar of Pake, he started putting together a group specifically tailored to advance the dream he shared with Lick and Engelbart for a new, more humanistic approach to computing.

One of his early recruits was William English, who had served as Engelbart’s right-hand man through much of the previous decade; it was English who had actually constructed the mouse that Engelbart had conceived. Many inside SRI, English not least among them, had grown frustrated with Engelbart, who managed with an air of patrician entitlement and seemed perpetually uninterested in building upon the likes of that showstopping 1968 demonstration by packaging his innovations into practical forms that might eventually reach outside the laboratory and the exhibit hall. English’s recruitment was the prelude to a full-blown defection of Engelbart’s team; a dozen more eventually followed. One of their first projects was to redesign the mouse, replacing the perpendicularly mounted wheels with a single ball that allowed easier, more precise movement. That work would be key to much of what would follow at PARC. It would also remain the standard mouse design for some thirty years, until the optical mouse began to phase out its older mechanical ancestor at last.

Alan Kay

Taylor was filling PARC with practical skill from SRI and elsewhere, but he still felt he lacked someone to join him in the role of conceptual thinker and philosopher. He wanted someone who could be an ally against the conventional wisdom — held still even by many he had hired — of computers as big, institutional systems rather than tools for the individual. He therefore recruited Alan Kay, a colleague and intellectual soul mate from his brief tenure at the University of Utah. Kay dreamed of a personal computer with “enough power to outrace your senses of sight and hearing, enough capacity to store thousands of pages, poems, letters, recipes, records, drawings, animations, musical scores, and anything else you would like to remember and change.” It was all pretty vague stuff, enough so that many in the computer-science community — including some of those working at PARC — regarded him as a crackpot, a fuzzy-headed dreamer slumming it in a field built on hard logic. Of course, they also said the same about Taylor. Taylor decided that Kay was just what he needed to make sure that PARC didn’t just become another exercise in incremental engineering. Sure enough, Kay arrived dreaming of something that wouldn’t materialize in anything like the form Kay imagined it until some two decades later. He called it the Dynabook. It was a small, flat rectangular box, about 9″ X 12.5″, which flipped open to reveal a screen and keyboard on which one could read, write, play, watch, and listen using media of one’s own choice. Kay was already describing a multimedia laptop computer — and he wasn’t that far away from the spirit of the iPad.

Combining the idealism of Taylor and Kay with the practical knowledge of their engineering staff and at least a strong nod toward the strategic needs of their parent corporation, PARC gradually refined its goal to be the creation of an office of the future that could hopefully also be a stepping stone on the path to a new paradigm for computing. Said office was constructed during the 1970s around four technologies developed right there at PARC: the laser printer; a new computer small enough to fit under a desk and possessed of almost unprecedented graphical capabilities; practical local-area networking in the form of Ethernet; and the graphical user interface (GUI). Together they all but encapsulated the face that computing would assume twenty years later.

Gary Starkweather’s laser printer

Of the aforementioned technologies, the laser printer was the most immediately, obviously applicable to Xerox’s core business. It’s thus not so surprising that its creator, Gary Starkweather, was one of the few at PARC to have been employed at Xerox before the opening of the new research facility. Previous computer printers had been clanking, chattering affairs that smashed a limited set of blocky characters onto a continuous feed of yellow-tinged fan-fold paper. They were okay for program listings and data dumps but hardly acceptable for creating business correspondence. In its original implementation Starkweather’s laser printer was also ugly, an unwieldy contraption sprouting wires out of every orifice, perched like a huge parasite atop a Xerox copy machine whose mechanisms it controlled. It was, however, revolutionary in that it treated documents not as a series of discrete characters but as a series of intricate pictures to be reproduced by the machinery of the copier it controlled. The advantages of the new approach were huge. Not only was the print quality vastly better, but it appeared on crisp white sheets of normal office paper. Best of all, it was now possible to use a variety of pleasing proportional fonts to replace the ugly old fixed-width characters of the line printers, to include non-English characters like umlauts and accents, and to add charts, graphs, and decorative touches like borders, even pictures.

Xerox Alto

The new computer was called the Alto. It was designed to be a personal computer, semi-portable and supporting just one user, although since it was not built around a microprocessor it was not technically a microcomputer like those that would soon be arriving on the hobbyist market. The Alto’s small size made it somewhat unusual, but what most set it apart was its display.

Most computers of this period — those that were interactive and thus used a display at all, that is — had no real concept of managing a display. They rather simply dumped their plain-text output, fire-and-forget fashion, to a teletype printer or terminal. (For example, it was on the former devices that the earliest text adventures were played, with the response to each command unspooling onto fan-folded paper.) Even more advanced systems, like the full-screen text editors with which Engelbart’s team had worked, tracked the contents of the screen only as a set of cells, each of which could contain a single fixed-width ASCII character; no greater granularity was possible, nor shapes that were not contained in the terminal’s single character set. Early experiments with computer graphics, such as the legendary Spacewar! game developed at MIT in the early 1960s, used a technique known as vector graphics, in which the computer manually controlled the electron gun which fired to form the images on the screen. A picture would be stored not as a grid of pixels but as a series of instructions — the sequence of strokes used to draw it on the display. (This is essentially the same technique as that developed by Ken Williams to store the graphics for On-Line’s Hi-Res Adventure line years later.) Because the early vector displays had no concept of display memory at all, a picture would have to be traced out many times per second, else the phosphors on the display would fade back to black. Such systems were not only difficult to program but much too coarse to allow the intricacies of text.

The Alto formed its display in a different way — the way the device you’re reading this on almost certainly does it. It stored the contents of its screen in memory as a grid of individual pixels, known as a bitmap. One bit represented the on/off status of one pixel; a total of 489,648 of them had to be used to represent the Alto’s 606 X 808 pixel black-and-white screen. (The Alto’s monitor had an unusual portrait orientation that duplicated the dimensions of a standard 8 1/2″ X 11″ sheet of office paper, in keeping with its intended place as the centerpiece of the office of the future.) This area of memory, often called a frame buffer during these times when it was a fairly esoteric design choice, was then simply duplicated onto the monitor screen by the Alto’s video hardware. Just as the laser printer saw textual documents as pictures to be reproduced dot by dot, the Alto saw even its textual displays in the same way. This approach was far more taxing on memory and computing power than traditional approaches, but it had huge advantages. Now the user needed no longer be restricted to a single font; she could have a choice of type styles, or even design her own. And each letter needed no longer fit into a single fixed-size cell on the screen, meaning that more elegant and readable proportional fonts were now possible.
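
To make the bitmap idea concrete, here is a minimal sketch in modern Python (purely illustrative; the Alto’s actual memory layout and microcode differed) of a one-bit-per-pixel frame buffer at the Alto’s screen dimensions:

```python
# A one-bit-per-pixel frame buffer at the Alto's portrait-mode
# 606 x 808 resolution. Assumption for illustration: pixels are
# packed eight to a byte, with each scan line padded to a whole byte.

WIDTH, HEIGHT = 606, 808             # screen dimensions in pixels

ROW_BYTES = (WIDTH + 7) // 8         # 76 bytes per scan line
framebuffer = bytearray(ROW_BYTES * HEIGHT)

def set_pixel(x, y, on=True):
    """Turn a single pixel on or off by flipping one bit in memory."""
    byte_index = y * ROW_BYTES + x // 8
    mask = 0x80 >> (x % 8)           # leftmost pixel is the high bit
    if on:
        framebuffer[byte_index] |= mask
    else:
        framebuffer[byte_index] &= ~mask

def get_pixel(x, y):
    return bool(framebuffer[y * ROW_BYTES + x // 8] & (0x80 >> (x % 8)))

print(WIDTH * HEIGHT)                # prints 489648 -- one bit per pixel
print(len(framebuffer))              # about 60 KB for the whole screen
```

Everything the user sees, text included, is just patterns of these bits; the video hardware does nothing but copy them to the monitor, which is why the Alto could draw any font or image it liked. The roughly 60 KB involved was an enormous commitment in an era when memory cost hundreds of dollars per kilobyte.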

Amongst many other applications, the what-you-see-is-what-you-get word processor was born on the Alto as a direct result of its bitmapped display. A word processor called Gypsy became the machine’s first and most important killer app. Using Gypsy, the user could mix fonts and styles and even images in a document, viewing it all onscreen exactly as it would later look on paper, thanks to the laser printer. The combination was so powerful, went so far beyond what people had heretofore been able to expect of computers or typewriters, that a new term, “desktop publishing,” was eventually coined to describe it. Suddenly an individual with an Alto and a laser printer could produce work that could rival — in appearance, anyway — that of a major publishing house. (As PARC’s own David Liddle wryly said, “Before that, you had to have an article accepted for publication to see your words rendered so beautifully. Now it could be complete rubbish, and still look beautiful.”) Soon even the professionals would be abandoning their old paste boards and mechanical typesetters. Ginn & Co., a textbook-publishing subsidiary of Xerox, became the first publishers in the world to go digital, thanks to a network of laser printers and Altos running Gypsy.

Ethernet

Speaking of which: the Ethernet network was largely the creation of PARC researcher Robert Metcalfe. Various networking schemes had been proposed and sometimes implemented in the years before Ethernet, but they all carried two big disadvantages: they were proprietary, limited to the products of a single company or even to a single type of machine; and they were fragile, prone to immediate failure if the smallest of their far-flung elements should break down. Ethernet overcame both problems. It was a well-documented standard that was also almost absurdly simple to implement, containing the bare minimum needed to accomplish the task effectively and absolutely nothing else. This quality also made it extremely reliable, as did its decentralized design, which left it dependent on no single computer on the network to continue to function. Unlike most earlier networking systems, which relied upon a charged cable like that used by the telephone system, Ethernet messages could pass through any passive conductive medium, including uncharged copper wire of the sort sold by the spool in any hardware store. The new protocol was simpler, cheaper, safer, and more reliable than anything that had come before.

Like so much else at PARC, Ethernet represented both a practical step toward the office of the future and a component of Taylor’s idealistic crusade for computers as communications devices. In immediate, practical terms, it let dozens of Altos at PARC or Ginn & Co. share just a few of Starkweather’s pricy laser printers. In the long run, it provided the standard by which millions of disparate devices could talk to one another — the “computer as a communications device” in its purest terms. Ethernet remains today one of the bedrock technologies of the hyper-connected world in which we live, a basic design so effective at what it does that it still hasn’t been improved upon.

A GUI file manager running on an Alto

The GUI was the slowest and most gradual of the innovations to come to PARC. When the Alto was designed, Engelbart and English’s mouse was included. However, it was pictured as being used only for the specialized function for which they had originally designed it: positioning the cursor within a text document, a natural convenience for the centerpiece of the office of the future. But then Alan Kay and his small team, known as the “lunatic fringe” even amongst the others at PARC, got their hands on some Altos and started to play. Unlike the hardcore programmers and engineers elsewhere at PARC, Kay’s team had not been selected on the basis of credentials or hacking talent. Kay rather looked for people “with special stars in their eyes,” dreamers and grand conceptual thinkers like him. Any technical skills they might lack, he reasoned, they could learn, or rely on other PARC hackers to provide; one of his brightest stars was Diana Merry, a former secretary for a PARC manager who just got what Kay was on about and took to coming to his meetings. Provided the Alto, the closest they could come to Kay’s cherished Dynabook, they went to work to make the technology sing. They developed a programming language called Smalltalk that was not only the first rigorously object-oriented language in history, the forerunner to C++, Java, and many others, but also simple enough for a grade-school kid to use. With Smalltalk they wrote a twelve-voice music synthesizer and composer (“Twang”), sketching and drawing programs galore, and of course the inevitable games (a networked, multiplayer version of the old standard Star Trek was a particular hit). Throughout, they re-purposed the mouse in unexpected new ways.

Kay and his team realized that many of the functions they were developing were complementary; it was likely that users would want to do them simultaneously. One might, for example, want to write an instructional document in a text editor at the same time as one edited a diagram meant for it in a drawing program. They developed tools to let users do this, but ran into a problem: the Alto’s screen, just the size of a single sheet of paper, simply couldn’t contain it all. Returning yet again to the idea of the office of the future, Kay asked what people in real offices did when they ran out of space on their desk. The answer, of course, was that they simply piled the document they were using at that instant on top of the one they weren’t, then proceeded to flip between the documents as needed. From there it all just seemed to come gushing out of Kay and his team.
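
A toy sketch of that insight, in Python rather than Smalltalk and making no claim to match PARC’s actual implementation: windows live in a back-to-front list, are drawn in order so the last one painted overlaps the rest, and clicking a partially covered window moves it to the end of the list, i.e., to the top of the pile.

```python
# The "pile of papers" insight as a data structure: a back-to-front
# list of windows. Drawing walks the list front to back; clicking
# raises the topmost window under the pointer. (Hypothetical names;
# nothing here is real Smalltalk or Alto code.)

class Window:
    def __init__(self, title, x, y, w, h):
        self.title, self.x, self.y, self.w, self.h = title, x, y, w, h

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class Desktop:
    def __init__(self):
        self.windows = []                 # back-to-front order

    def open(self, window):
        self.windows.append(window)       # new windows start on top

    def click(self, px, py):
        """Find the topmost window under the pointer and raise it."""
        for win in reversed(self.windows):
            if win.contains(px, py):
                self.windows.remove(win)
                self.windows.append(win)  # move to the top of the pile
                return win
        return None

desk = Desktop()
desk.open(Window("text editor", 0, 0, 400, 300))
desk.open(Window("drawing", 200, 150, 400, 300))
top = desk.click(250, 200)                # a point where the two overlap
print(top.title)                          # prints "drawing"
```

The whole desktop metaphor falls out of this one trick: screen space stops being a fixed grid and becomes a pile you can shuffle, just like the papers on a real desk.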

The Alto’s Smalltalk windowing system in mature form

In February of 1975 Kay called together much of PARC, saying he had “a few things to show them.” What they saw was a rough draft of the graphical user interface that we all know today: discrete, overlapping windows; mouse-driven navigation; pop-up menus. In a very real way it was the fruition of everything they had been working on for almost five years, and everything Taylor and Kay had been dreaming of for many more. At last, at least in this privileged research institution, the technology was catching up to their dreams. Now, not quite two years after the Alto itself had been finished, they knew what it needed to be. Kay and the others at PARC would spend several more years refining the vision, but the blueprint for the future was in place already in 1975.

Xerox ultimately received little of the benefit they might have from all this visionary technology. A rather hidebound, intensely bureaucratic management structure never really understood the work that was going on at PARC, whose personnel they thought of as vaguely dangerous, undisciplined and semi-disreputable. Unsurprisingly, they capitalized most effectively on the PARC invention closest to the copier technology they already understood: the laser printer. Even here they lost years to confusion and bureaucratic infighting, allowing IBM to beat them to the market with the world’s first commercial laser printer. However, Starkweather’s work finally resulted in the smaller, more refined Xerox 9700 of 1977, which remained for many years a major moneymaker. Indeed, all of the expense of PARC was likely financially justified by the 9700 alone.

Still, the bulk of PARC’s innovations went comparatively unexploited. During the late 1970s Xerox did sell Alto workstations to a small number of customers, among them Sweden’s national telephone service, Boeing, and Jimmy Carter’s White House. Yet the commercialization of the Alto, despite pleading from many inside PARC who were growing tired of seeing their innovations used only in their laboratories, was never regarded by Xerox’s management as more than a cautious experiment. With a bit of corporate urgency, Altos could easily have been offered for sale well before the trinity of 1977 made its debut. While it would have been a more expensive machine aimed at a very different market, a computer equipped with a full GUI on sale before the likes of the Apple II, TRS-80, and PET would likely have dramatically altered the evolution of the PC and made Xerox a major player in the PC revolution. Very possibly they might have ended up playing a role similar to that of IBM in our own timeline — only years earlier, and with better, more visionary technology.

The Xerox Star

Xerox’s most concerted attempt to exploit the innovations of PARC as a whole came only in 1981, in the form of the Xerox Star “office information system.” The product of six troubled years of development, shepherded to release against the odds by ex-PARCer David Liddle, the Star did it all, and often better than it had been done inside PARC itself. The innovations of Kay and his researchers — icons, windows, scroll bars, sliders, pop-up menus — were refined into the full desktop metaphor that remains with us today, the perfect paradigm for the office of the future. Also included in each Star was a built-in Ethernet port to link it with its peers as well as the office laser printer. The new machine represented the commercial fruition of everything PARC had been working toward for the last decade.

The desktop metaphor in full flight on the Star

Alas, the Star was a commercial failure. Its price of almost $17,000 per workstation meant that assembling a full office of the future could easily send the price tag north of $100,000. It also had the misfortune to arrive just a few months before the IBM PC, a vastly simpler, utilitarian design that lacked the Star’s elegance but was much cheaper and open to third-party hardware and software. Marketed as a very unconventional piece of conventional office equipment rather than a full-fledged PC, the Star was by contrast locked into the hardware and software Xerox was willing to provide. In the end Xerox managed to sell only about 30,000 of them — a sad, anticlimactic ending to the glory days of innovation at PARC. (The same year that the Star was released Robert Taylor left PARC, taking the last remnants of his original team of innovators with him. By this time Alan Kay was already long gone, driven away by management’s increased demands for practical, shorter-term projects rather than leaps of faith.)

Like the Alto, the Star’s influence would be far out of proportion to the number produced. It is after all to this machine that we owe the ubiquitous desktop metaphor. If anything, the innovations of the Star tend to go somewhat under-credited today in the understandable rush to lionize the achievements inside PARC proper. Perhaps this is somewhat down to Xerox’s dour advertising rhetoric that presented the Star as “just” an “office administration assistant”; those words don’t exactly smack of a machine to change the world.

Oddly, the Star’s fate was just the first of a whole string of similar disappointments from many companies. The GUI and the desktop metaphor were concepts that seemed obviously, commonsensically better than the way computers currently worked to just about everyone who saw them, but it would take another full decade for them to remake the face of the general-purpose PC. Those years are littered with failures and disappointments. Everyone knew what the future must be like, but no one could quite manage to get there. We’ll look at one of the earliest and most famous of these bumps on the road next time.

(Despite some disconcerting errors of fact about the computing world outside the laboratory, Michael A. Hiltzik’s Dealers of Lightning is the essential history of Xerox PARC, and immensely readable to boot. If you’re interested in delving into what went on there in far more detail than I can provide in a single blog post, it’s your obvious next stop.)

 
 


Ring in the New (Blog Initiatives)

As most of you have probably already gathered, I tend to be pretty horrible at social networking and at self-promotion in general. But it’s a new year, and it seems to be a good moment to at least make a stab at embracing modernity. So, I’m rolling out a couple of new initiatives today that will hopefully help me as well as you.

First, you’ll notice that there’s now a little donation button on the sidebar to the right of this post. If you click it you’ll be taken to PayPal, where you can send me some money if you’d like. I frankly struggled a bit with myself before I made this move. I’ve always written here out of passion and a belief that the work I’m doing is really, genuinely important. Knowing that thousands of you are reading and enjoying what I write is a huge thrill in itself, one that almost feels like it ought to be enough. On the other hand, though, the time I spend researching and writing for the blog is time I can’t spend on other, paying projects. So, I just ask that you think about what you can afford and what you think this blog is worth, whether to you in personal enjoyment or — at the risk of sounding too grandiose — to posterity. Then maybe kick a little into the kitty, at whatever level and frequency seems appropriate to you. If you can’t afford to contribute right now, never fear; I’ll never restrict content to “subscribers” or anything of that sort. Nor will I bother to try to convince you that the blog’s survival depends on your donations; I love it too much, and will happily continue if I don’t get a cent. But if I should get a nice surprise from all you kind souls, that might just help me to justify spending more time on it — which means more frequent new posts for you to read.

Second, I’ve finally taken the big plunge and joined the Twittering classes. My virgin id there is DigiAntiquarian. I’ve had some of you recently asking me for a tweet when new posts go up here. At least as of now, that’s the main purpose for the account. If the WordPress plug-in I installed works correctly, this post should be the first to be broadcast. Fingers crossed!

With that administrative stuff taken care of, we’ll next week be turning away from the hardware manufacturers and back to the important games of 1983, starting with the arrival of a new publisher that’s still with us to this day. In the meantime, do check out the reborn SPAG Magazine, now edited by Dannii Willis, if you’re at all interested in modern interactive fiction. I was the editor for several years in an earlier life, and it makes me very happy to see my old baby return in such capable hands.

(Update 24 hours later: Thanks so much for the generosity many of you have already shown! And thanks also for your suggestions about better leveraging social media. I’ll have a think about what seems doable without cluttering up the site too badly.

In other news, I’ve made a change in plans which means that we won’t get back to games quite yet. I’ve one more detour into computer-science history yet to take, and I now realize this is the best time for it. But I think it’s one hell of an interesting detour; hopefully you’ll agree.)

 

Shiny and Exciting vs. Dull and Boring

Apple Computer has remained quite consistent about a number of things throughout all the years of their existence. One of the most striking of these is their complete lack of interest in competing on the basis of price. The implicit claim has always been that Apple products are special, made for special people, even as the number of “special people” who buy them has paradoxically in recent years come to rival the number of the more plebeian sorts who favor the likes of Microsoft. As such, it’s only just and right that you have to pay a bit of a premium for them. That’s of course a policy every company would have if they could get away with it. The remarkable thing about Apple is that they’ve been able to do just that for several decades now. Partly that success comes down to the sheer dumb luck of coming along at the right moment to be anointed the face of the PC revolution, with an all-American story of entrepreneurship and a charismatic and photogenic young founder that the mainstream media couldn’t help but love. Partly it comes down to a series of products that reflected a real vision for the potentialities of computing as opposed to just offering more MB and MHz than what had come out the previous year. But, perhaps most of all, the world believed Apple was special because Apple believed Apple was special, with a sincere certainty that no PR firm could have faked. At its worst, this quality makes the company and the cult of fanatically loyal users that surrounds it insufferable in their smug insularity. At its best, it lets them change the world in ways that even their perpetual rival Microsoft never quite managed. Bill Gates had a practical instinct for knowing where the computer industry was going and how he could best position Microsoft to ride the wave, but Steve Jobs had a vision for how computer technology could transform life as ordinary people knew it.

The conflict between this utopian vision, rooted in Apple’s DNA just as it was in that of Jobs the unrepentant California dreamer himself, and the reality of Apple’s prices that have historically limited its realization to a relatively affluent elite is just one of a number of contradictions and internal conflicts that make Apple such a fascinating study. Some observers perceive Apple as going through something of an identity crisis in this new post-Jobs era, with advertising that isn’t as sharp and products that don’t quite stand out like they used to. Maybe Apple, for thirty years the plucky indie band that belonged to the cool kids and the tastemakers, isn’t sure how to behave now that they’re on a major label and being bought by everyone. Their products are now increasingly regarded as just products, without the aura of specialness that insulated them from direct comparison on the strict basis of price and features for so long.

But that’s now. When the Home Computer Wars got started in earnest in 1982, Apple adopted a policy of, as a contemporaneous Compute! editorial put it, “completely ignoring the low-end market,” positioning themselves as blissfully above such petty concerns as price/feature comparison charts. Even for a company without Apple’s uniquely Teflon reputation that wasn’t necessarily a bad policy to follow. As Texas Instruments and Atari would soon learn, no one had a prayer of beating Jack Tramiel’s vertically-integrated Commodore in a price war. At the same time, though, the new Commodore 64 represented in its 64 K of memory a significant raising of the bar for any everyday, general-purpose PC. In that sense Apple did have to respond, loath as they would have been to publicly admit that the likes of Commodore could have any impact on their strategy. The result, the Apple IIe, was the product not only of the changes wrought on the industry by the Commodore 64 but also of the last several chaotic years inside Apple’s Cupertino, California, headquarters.

Apple’s superb public image could somewhat belie the fact that through 1982 they had managed to release exactly one really successful computer, the Apple II Plus of 1979. Of the machines prior to the II Plus, the Apple I (1976) had been a mere hobbyist endeavor assembled by hand in Jobs’s garage, and had sold in about the quantities you might expect for such a project; the original Apple II (1977) was an impressive proof of concept that was overshadowed by Radio Shack’s cheaper, better distributed TRS-80. It was the 48 K II Plus — mated to Steve Wozniak’s last great feat of minimalist engineering, the Disk II system, and with the PC industry’s first killer app, VisiCalc, in tow — that made Apple. After it… well, there’s quite a tale.

Mike Markkula


Michael Scott


It’s often forgotten that Apple’s early history isn’t just the story of the two kids who started the company in a garage. Right after the two Steves came the two Mikes. Apple’s third and fourth employees, Mike Markkula and Michael Scott, were both older Silicon Valley insiders with money and resumes listing companies like Intel and Fairchild — about as Establishment as you could get. With his role of visionary-in-chief and public spokesman not yet clearly defined, Jobs was the odd man out among the original four, bereft of both Woz’s technical genius and the connections, deep pockets, and business acumen of Markkula and Scott. Tellingly, when Scott issued identification badges for the first time he made Woz, the architect of all of the company’s projects so far and presumed key to their future success, Employee #1, until Jobs’s endless whining convinced him to make him Employee #0 as a compromise. Markkula and Scott managed in remarkably short order to institute a very Establishment bureaucratic structure inside the offices of the supposedly freewheeling Apple. Jobs bounced about the org chart, in Frank Rose’s words “like a feral child” (like the young Bill Gates, the young Steve Jobs was not always big on personal hygiene), leaving chaos in his wake.

Within Apple the II Plus was regarded as the logical end of the road for the Apple II, the ultimate maturation of Woz’s original template. By the time it was released all three of the others were souring on Woz himself. Markkula and Scott found Woz’s lone-wolf style of high-wire engineering to be incompatible with their growing business and its associated bureaucracy, while Jobs resented Woz’s status as the father of the Apple II and desperately wanted to prove himself by creating something of his own, without Woz’s involvement. And so the three came to a tacit agreement to ease Woz aside. Woz himself, whose native guilelessness and naivete are so extreme they can seem almost disingenuous to more cynical souls, simply let it happen without seeming to even realize what was going on. It was decided to divide Apple’s research and development into two projects. One, codenamed Sara, would be a practical attempt to create a logical successor to the Apple II that would be more expensive but also more appealing to businesses and high-end users. The other, codenamed Lisa, would be built around a powerful new processor just released by Motorola, the 68000. As a long-term, blue-sky project, Lisa’s design and ultimate goals would remain amorphous for quite some time while most of Apple’s day-to-day effort went into Sara.

The Sara would be designed around the same MOS 6502 CPU as the Apple II, but clocked to 1.8 MHz rather than 1 MHz. A bank-switching scheme would let the machine ship with a full 128 K of memory, with more expansion possible. The graphics would be a dramatic improvement on the Apple II, with higher resolutions and none of the weird color restrictions that could be such a challenge for programmers. The standard text display would now show the 80 columns so critical to word processing and many other business tasks. And replacing the simple ROM-housed BASIC environment of the Apple II would be a more sophisticated operating system booted off of disk. To make sure everyone got the message about the latter, they would even call it the “Sophisticated Operating System,” or, rather unfortunately in light of later developments, SOS. (In one of the more infuriating examples of the endemic cutesiness that has often afflicted Apple, they insisted that “SOS” be pronounced as “Applesauce,” but at least on this occasion most people ended up having none of it.)

It all sounded like a great plan on paper. However, the new group of engineers hired to replace Woz soon had their hands full dealing with Jobs, whose vision for Sara seemed to change by the day. Determined to make Sara “his” machine, he demanded things that were often incompatible with the realities of computer technology circa 1979. The new machine must have a certain external size and shape, leaving the engineers to cram chips into spaces with inadequate ventilation. When they requested an internal fan, Jobs vetoed the idea; he wanted a sleek, elegant, quiet device, not something that sounded like a piece of industrial equipment. In addition to dealing with Jobs, the engineers were also put under enormous pressure to finish Sara, now to be called the Apple III, before Apple’s IPO in December of 1980. Cutting corners all the way, they managed to start shipping the machine just a few weeks before. In retrospect it was a lucky break that they cut it so close, because it meant that reports of what a mess the Apple III was hadn’t filtered up in enough strength or quantity by the time of the IPO to affect it.

The Apple III


Just as the engineers had feared, Apple IIIs started overheating over long use. As they got too hot, chips would expand and come out of their sockets. Apple’s customer-support people were soon reduced to asking angry callers to please pick up their new $6000 wonder-computers and drop them back onto the desk in the hope that this would re-seat the chips. In addition to the overheating, a whole host of other problems cropped up, from shoddy cables to corroding circuit boards. Apple struggled to deal with the problems for months, repairing or replacing tens of thousands of machines. Unfortunately, the replacements were often as unreliable as the originals. Disgusted by the fiasco and by the company culture he felt had led to it (by which he meant largely Markkula and Jobs), Michael Scott quit Apple in July of 1981. With Woz having recognized there was little role left for him at Apple and having started on the first of a series of extended leaves of absence, two of the four original partners were effectively now gone. Far from claiming the Apple III as his equivalent to Woz’s baby the Apple II, Jobs quickly distanced himself from those left struggling to turn this particular sow’s ear into a silk purse. Randy Wigginton, Employee #6, put the situation colorfully: “The Apple III was kind of like a baby conceived during a group orgy, and [later] everybody had this bad headache and there’s this bastard child, and everyone says, ‘It’s not mine.'” Jobs moved on to the Lisa project, which was now really ramping up and becoming Apple’s big priority, not to mention his own great hope for making his mark. When he wore out his welcome there, he moved on yet again, to a new project called Macintosh.

In the end the Apple III remained in production until 1984 despite selling barely 100,000 units in total over that period. Within Apple it came to be regarded as a colossal missed opportunity which could have sewn up the business market for them before IBM entered with their own PC. That strikes me as doubtful. While on paper the Apple III was unquestionably one of the most powerful 6502-based computers ever made, and in real life it grew to be quite a nice little machine as the remaining dogged engineers gradually solved its various problems, the 8088-based IBM PC was superior in processing power if nothing else, as well as having the IBM logo on its case. The Apple III was certainly damaging to Apple financially, but it could have been worse. Apple’s position as press favorites and the continuing success of the II Plus insulated them surprisingly well; relatively few lost faith in that special Apple magic.

Indeed, as Apple poured money into the Apple III and now into the ever more grandiose Lisa project, the II Plus just kept on selling, long after Apple’s own projections had it fading away. The fact is that the only thing that sustained Apple through this period of flailing about was the II Plus. If Apple’s own projections about its sales had been correct, there wouldn’t be an Apple still around today. Luckily, users simply refused to let it go, and their evangelization of the machine and the many third-party hardware and software makers who supported it kept it selling incredibly strongly even when Apple themselves seemed as if they could hardly care less about it. Thanks to Woz’s open design, they didn’t have to; the machine seemed almost infinitely malleable and expandable, an open canvas for others to paint upon. Far from appreciating their loyalty or even simply Apple’s good fortune, Jobs grew more and more frustrated at the Apple II’s refusal to die. He called those parts of the company still dedicated to the line the “dull and boring” divisions, the people who worked in them “Clydesdales” because they were so slow and plodding. He delighted in wandering over to the remnants of the Apple II engineering department (now stuck away in an ugly corner of Apple’s growing campus) to tell everyone how shit all of their products were, seemingly oblivious to the fact that those same products were the only things funding his grander vision of computing.

Dull and boring or not, by 1982 it was becoming obvious that Apple was tempting fate by continuing to leave their only commercially viable computer unimproved. Despite the flexibility of its basic design, some core aspects of the machine were now almost absurdly primitive. For instance, it still had no native support for lower-case letters. And now the much cheaper Commodore 64 was about to be released with a number of features that put the old II Plus to shame. At virtually any moment sales could collapse as users flocked at last to the 64 or something else, taking Apple as a company down with them. Apple may not have been interested in price wars, but it was past time that they reward Apple II users with some sort of sop for their loyalty. An engineer named Walt Broedner turned down a spot on Jobs’s growing Macintosh team to make a next-generation Apple II, out of love of the platform and a desire to sustain the legacy of Woz.

Back in 1980, Broedner and Woz had worked on a project to dramatically reduce the production cost of the II Plus by squeezing the whole design into vastly fewer chips. But then management, caught in the grip of Apple III fever, had cancelled it. The Apple II line, they had reasoned, wouldn’t sell for long enough that the savings would justify the cost of tooling up for a whole new design. Broedner now started again from the old plan he’d put together with Woz, but two more years of ever improving chip-making technology let him take it much further. In the end he and his team reduced the chip count from 120 to an incredible 31, while significantly increasing the computer’s capabilities. Using a bank-switching scheme even more intricate than that of the Commodore 64, they boosted standard memory to 64 K, while also making it possible to further expand the machine to 128 K or (eventually) more. An 80-column text display was now standard, with lower-case letters also making an appearance at long last. Just as the machine entered production, Broedner and his team realized that the changes they had made to implement 80-column text would also allow a new, higher resolution graphics mode with only a few minor changes to the motherboard. Accordingly, all but the first batch of machines shipped with a “double-hi-res” mode. It came with even more caveats and color restrictions than the standard hi-res mode, but it ran at a remarkable 560×192. Finally, the new design was also cooler and more reliable. Broedner and his team managed all of this while still keeping the new machine 99.9% compatible with software written for the old.
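For readers curious how bank switching lets a machine hold more RAM than its processor can address: a 6502 has only a 16-bit address bus, so it can ever "see" just 64 K at once, and extra memory is made visible by "soft switches" that swap which physical bank of chips answers reads and writes within a window of the address space. The little Python model below is only an illustration of the general idea; the window boundaries and the `select_bank` interface are invented for clarity, not the Apple IIe's actual soft-switch scheme.

```python
# Toy model of 6502-style bank switching. The CPU addresses 64 K
# ($0000-$FFFF); an 8 K window of that space can be backed by any one
# of several physical RAM banks, chosen by a (hypothetical) switch.

WINDOW_START, WINDOW_END = 0x2000, 0x3FFF  # 8 K bankable window

class BankedMemory:
    def __init__(self, num_banks=2):
        self.fixed = bytearray(0x10000)              # always-visible 64 K
        self.banks = [bytearray(0x2000) for _ in range(num_banks)]
        self.active = 0                              # bank filling the window

    def select_bank(self, n):
        """Stands in for poking a soft-switch address on real hardware."""
        self.active = n

    def read(self, addr):
        if WINDOW_START <= addr <= WINDOW_END:
            return self.banks[self.active][addr - WINDOW_START]
        return self.fixed[addr]

    def write(self, addr, value):
        if WINDOW_START <= addr <= WINDOW_END:
            self.banks[self.active][addr - WINDOW_START] = value & 0xFF
        else:
            self.fixed[addr] = value & 0xFF

mem = BankedMemory()
mem.write(0x2000, 0xAA)          # lands in bank 0
mem.select_bank(1)
mem.write(0x2000, 0xBB)          # same CPU address, different physical RAM
mem.select_bank(0)
assert mem.read(0x2000) == 0xAA  # bank 0's byte survived untouched
```

The same 16-bit address thus reaches different physical bytes depending on the switch state, which is how 128 K of RAM can hide behind a 64 K address space; the cost is that software must carefully track which bank is live at any moment, which is why such schemes were notoriously fiddly to program.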

The Apple IIe


Known as the “Super II” during its development phase, the new machine was eventually christened the IIe, as in “Enhanced.” Jobs predictably hated it, finding its big, chunky case with its removable top and row of hacker-friendly expansion slots anathema to his own ideas of computers as elegantly closed designs that were complete in themselves. He sniped at the IIe from its genesis right through its commercial debut and for years afterward. He had plenty of time to do so, because the IIe proved to be a spectacular success right from its release in January of 1983. At last Apple had a true successor to the II Plus, albeit in the form they had least expected and, truth be told, least desired. The Lisa, meanwhile, which shipped at last some six months after the IIe, turned into yet another major disappointment, not recouping a fraction of its huge development cost. Much to Jobs’s chagrin, it seemed that Apple simply couldn’t field a successful product unless it had a “II” on the case.

But what a success the IIe turned out to be, a veritable dream product for any company. Despite its much reduced manufacturing cost, Apple didn’t reduce the new machine’s selling price at all. Why should they, if people were still happy to buy at the old prices? They sold the IIe to distributors at three times their cost to make them, an all but unheard of margin of profit. The machine itself may have reflected little of Jobs’s sensibility, but the price at which Apple sold it was right in line with one of his core business philosophies, as Woz relayed in an anecdote from the earliest days of their friendship:

“Steve had worked in surplus electronics and said if you can buy a part for 30 cents and sell it to this guy at the surplus store for $6, you don’t have to tell him what you paid for it. It’s worth $6 to the guy. And that was his philosophy of running a business.”

There are a couple of ways to look at the outsize disparity between the IIe’s production cost and its selling price. One might see it as borderline unethical, a cynical fleecing of consumers who didn’t know any better and a marked contrast to Commodore’s drive to open up computing for everyone by selling machines that were in some ways inferior but in some ways better for less and less money. On the other hand, all of that seemingly unearned windfall didn’t disappear into thin air. Most of it was rather plowed back into Apple’s ambitious research-and-development projects that finally did change the world. (Not alone, and not to the degree Apple’s own rhetoric tried to advertise, but credit where it’s due.) Jack Tramiel, a man who saw computers as essentially elaborate calculators to be sold on the basis of price and bullet lists of technical specifications, would have been as incapable of articulating such a vision as he would have been of conceiving it in the first place. If nothing else, the IIe shows how good it is to have a cozy relationship with the press and an excellent public image, things Apple mostly enjoyed even in its worst years and Commodore rarely did even in its best. They make many people willing to pay a premium for your stuff and not ask too many questions in the process.

The Apple IIe sold like crazy, a cash cow of staggering proportions. By Christmas of 1983 IIe sales were already approaching the half-million mark. Sales then doubled for 1984, its best year; in that year alone they approached one million units. It would remain in production (with various minor variations) until November of 1993, the longest-lived single product in Apple’s corporate history. During much of that period it continued to largely sustain Apple as they struggled to get Lisa and later, after abandoning that as a lost cause, Macintosh off the ground.

In more immediate terms, the arrival of the Apple IIe also allowed the fruition of a trend in games begun by the Commodore 64. By the end of 1983, with not only the 64 and the IIe but also new Atari models introduced with 64 K of memory, that figure was already becoming a baseline expectation. A jump from 48 K to 64 K may seem a relatively modest one, but it allowed for that much more richness, that much more complexity, and went a long way toward enabling more ambitious concepts that began to emerge as the early 1980s turned into the mid-1980s.

(Unlike the relatively under-served Commodore, there is a wealth of published material on the history of Apple and its internal confusion during this period. Two of the best are West of Eden by Frank Rose and Apple Confidential by Owen Linzmayer. Among other merits, both give an unvarnished picture of what an absolute pain in the ass the young Steve Jobs could be.)

 

Posted by on December 30, 2012 in Digital Antiquaria, Interactive Fiction

 


Business is War

In the 64 Commodore had their potentially world-beating home computer. Now they needed to sell it. Fortunately, Jack Tramiel still had to hand Kit Spencer, the British mastermind behind the PET’s success in his home country and the VIC-20’s in the United States. The pitchman for the latter campaign, William Shatner, was no longer an option to help sell the 64. His contract had run out just as the new machine was released, and his asking price for another go-round had increased beyond what Tramiel was willing to pay in the wake of his hit television series T.J. Hooker and the movie Star Trek II: The Wrath of Khan. Spencer decided to forgo a pitchman entirely in favor of a more direct approach that would hammer on the competition while endlessly repeating those two all-important numbers: 64 K and less than $600. He queued up a major advertising blitz in both print and television for the 1982 Christmas season, the second and final time in their history that Commodore would mount such a concentrated, smart promotional effort.



Effective as it was, the campaign had none of the creativity or easy grace of the best advertising from Apple or IBM. The ads simply regurgitated those two critical numbers over and over in a somewhat numbing fashion, while comparing them with the memory size and price of one or more unfortunate competitors. Surprisingly, there was little mention of the unique graphics and sound capabilities that in the long run would define the Commodore 64 as a platform. It almost seems as if Commodore themselves did not entirely understand the capabilities of the chips that Al Charpentier and Bob Yannes had created for them. Still, Spencer showed no fear of punching above his weight. In addition to the 64’s obvious competitors in the low-end market, he happily went after much more expensive, more business-oriented machines like the Apple II and the IBM PC. Indeed, here those two critical numbers, at least when taken without any further context, favored the 64 even more markedly. The computer industry had never before seen advertising this nakedly aggressive, this determined to name names and call out the competition on their (alleged) failings. It would win Commodore few friends inside the industry. But Tramiel didn’t care; the ads were effective, and that was the important thing.

Commodore took their shots at the likes of Apple and IBM, but the real goal had become ownership of the rapidly emerging low-end — read, “home computer” — market. Tramiel’s competition there were the game consoles and the two other computer makers making a serious mass-market play for the same consumers, Atari and Texas Instruments. For the lower end of the low end, Commodore had the VIC-20; for the higher end, the 64.

Atari 5200


Atari’s big new product for Christmas 1982 was the 5200, a new console based on the same chipset as their computer designs. (Those chips had originally been designed for a successor to the VCS, but rerouted into full-fledged computers when sales of the current VCS just kept increasing. Thus the process finally came full circle, albeit three years later than expected.) The 5200 was something of a stopgap, a rather panicked product from a company whose management had long since lost interest in engineering innovations. It actually marked Atari’s first major new hardware release in three years. Research and development, you see, had shrunk to virtually nil under the stewardship of CEO Ray Kassar, a former titan of the textile industry who held videogames and his customers in something perilously close to contempt. Despite being based on the same hardware, the 5200 was inexplicably incompatible with cartridges for the existing Atari home computers. Those games that were available at launch were underwhelming, and the 5200 was a major disappointment. Only the VCS — now retroactively renamed the 2600 to account for the new 5200 — continued to sell in good quantities, and those were shrinking steadily. Aside from the 2600 and 5200, Atari had only its two three-year-old computers, the chintzy, little-loved 400 and the impressive but also more expensive 800 with only 48 K of memory. With the latter selling for upwards of $600 and both machines halfheartedly (at best) promoted, the big battles of the conflict that the press would soon dub the “Home Computer Wars” would be fought between TI and Commodore. It would be a disappointing Christmas for Atari, and one which foretold bigger problems soon to come.

Put more personally — and very personal it would quickly become — the Home Computer Wars would be fought between Jack Tramiel and the youthful head of TI’s consumer-products division, William J. Turner. The opening salvo was unleashed shortly before the 64’s introduction by, surprisingly, TI rather than Commodore. At that time the TI-99/4A was selling for about $300, the VIC-20 for $240 to $250. In a move they would eventually come to regret, TI suddenly announced a $100 rebate on the TI-99/4A, bringing the final price of the machine to considerably less than that of the inferior VIC-20. With TI having provided him his Pearl Harbor, Jack Tramiel went to war. On the very same day that Turner had opened hostilities, Tramiel slashed the wholesale price of the VIC-20, bringing the typical retail price down into the neighborhood of $175. Despite this move, consumers chose the TI-99/4A by about a three to one margin that Christmas, obviously judging its superior hardware worth an extra $25 and the delayed gratification of waiting for a rebate check. Some fun advertising featuring Bill Cosby didn’t hurt a bit either, while Commodore’s own contract with William Shatner was now history, leaving little advertising presence for the VIC-20 to complement the big push Spencer was making with the 64. TI sold more than half a million computers in just a few months. Round One: TI.

Of course, the 64 did very well as well, although at almost $600 it sold in nowhere near the quantities it eventually would. In those days, computers were sold through two channels. One was the network of dedicated dealers who had helped to build the industry from the beginning, a group that included chains like Computerland and MicroAge as well as plenty of independent shops. A more recent outlet was the so-called mass merchandisers — discounters like K-Mart and Toys ‘R’ Us that lived by stacking ’em deep and selling ’em cheap, with none of the knowledge and support to be found at the dealers. Commodore and TI had been the first to begin selling their computers through mass merchandisers. Here Tramiel and Turner shared the same vision, seeing these low-end computers as consumer electronics rather than tools for hobbyists or businessmen — a marked departure from the attitude of, say, Apple. It really wasn’t possible for a computer to be successful in both distribution models. As soon as it was released to the merchandisers, the game was up for the dealers, as customers would happily come to them to get all of their questions answered, then go make the actual purchase at the big, splashy store around the corner. Commodore’s dealers had had a hard time of it for years, suffering through the limited success of the PET line in the American market only to see Commodore pass its first major sales success there, the VIC-20, to the mass merchandisers. They were understandably desperate to have the 64. Cheap as it was for its capabilities, it still represented much more of an investment than the VIC-20. Surely buyers would want to take advantage of the expertise of a real dealer. Tramiel agreed, or at least claimed to. But then, just as the Christmas season was closing, he suddenly started shipping the 64 to the mass merchandisers as well. Dealers wondering what had happened were left with only the parable of the scorpion and the frog for solace. What could Jack say? It was just his nature. By the following spring the street price of a Commodore 64 had dropped below $400, and it could be found on the shelves of every K-Mart and Toys ‘R’ Us in the country.

With the Commodore 64 joining the VIC-20 in the trenches, Christmas 1982 was looking like only the opening skirmish. 1983 was the year when the Home Computer Wars would peak. This was also the year of the Great Videogame Crash, when the market for Atari 2600 hardware and software went into free fall. In one year’s time Atari went from being the darling of Wall Street to a potentially deadly anchor — hemorrhaging millions of dollars and complete with a disgraced CEO under investigation for insider trading — for a Warner Communications that was suddenly desperate to get rid of it before it pulled the whole corporation down. Just as some had been predicting the previous year, home computers moved in to fill some of the vacuum left by the 2600’s sudden collapse.

Atari 1200XL


In a desperate attempt to field a counterargument to the 64, Atari rushed into production early in 1983 their first new computer since introducing the 400 and 800 more than three years before. Thanks to a bank-switching scheme similar to that of the 64, the Atari 1200XL matched that machine’s 64 K of memory. Unfortunately, it was in almost every other respect a disaster. Atari made the 1200XL a “closed box” design, with none of the expansion possibilities that had made the 800 a favorite of hackers. They used new video hardware that was supposed to be better than the old, but instead yielded a fuzzier display on most monitors and televisions. Worst of all, the changes made to accommodate the extra memory made the new machine incompatible with a whole swathe of software written for the older machines, including many of the games that drove home-computer sales. An apocryphal story has sales of the Atari 800 dramatically increasing in the wake of the 1200XL’s release, as potential buyers who had been sitting on the fence rushed to buy the older machine out of fear it would soon be cancelled and leave them no option but the white elephant that was the 1200XL.

Whatever the truth of such stories, sales for the Atari computer line as a whole continued to lag far behind those of Commodore and TI, and far behind what would be needed to keep Atari a viable concern in this new world order. Huge as Atari (briefly) was, they had no chip-making facilities of their own. Instead, their products were full of chips bought from other companies — prominently including MOS. Not only were both their console and computer lines built around the 6502, but MOS manufactured many of the game cartridges for the 2600 and 5200. Thus even when Commodore lost by seeing a potential customer choose an Atari over one of their own machines, they still won in the sense that the Atari machine was built using some of their chips — chips for which Atari had to pay them.

Atari would largely be collateral damage in the Home Computer Wars. As I remarked before, however, it was personal between Tramiel and TI. You may remember that almost ten years before these events Commodore had been a thriving maker of calculators and digital watches. TI, along with several Japanese companies, had entered those markets with devices built entirely from their own chips, which allowed them to dramatically undercut Commodore’s prices and very nearly force them out of business. Only the acquisition of MOS Technology and the PET had saved Commodore. Now Tramiel, who never forgot a slight, much less a full-on assault, could smell payback. Thanks to MOS, Commodore were now also able to make for themselves virtually all of the chips found in the VIC-20 and the 64, with the exception only of the memory chips. TI’s recent actions would seem to indicate that they thought they could drive Commodore out of the computer market just as they had driven them out of the watch and calculator markets. But this time, with both companies almost fully vertically integrated, things would be different. Bill Turner’s colossal mistake was to build his promotional campaign for the TI-99/4A entirely around price, failing to note that it was not just much cheaper than the 64 but also much more capable than the VIC-20. As it was, no matter how low Turner went, Tramiel could always go lower, because the VIC-20 was a much simpler, cheaper design to manufacture. If the Home Computer Wars were going to be all about the price tag, Turner was destined to lose.

The TI-99/4A also had another huge weakness, one ironically connected with what TI touted as its biggest strength outside of its price: its reliance on “Solid State Software,” or cartridges. Producing cartridges for sale required vastly more resources than did distributing software on cassettes or floppy disks, and at any rate TI was determined to strangle any nascent independent software market for their machine in favor of cornering this lucrative revenue stream for their own line of cartridges. They closely guarded the secrets of the machine’s design, and threatened any third-party developers who managed to write something for the platform with lawsuits if they failed to go through TI’s own licensing program. Those who entered said program would be rewarded with a handsome 10 percent of their software’s profits. Thus the TI-99/4A lacked the variety of software — by which I mainly mean games, the guilty pleasure that really drove the home-computer market — that existed for the VIC-20 and, soon, the 64. Although this wasn’t such an obvious concern for ordinary consumers, the TI-99/4A was thus also largely bereft of the do-it-yourself hacker spirit that marked most of the early computing platforms. (Radio Shack was already paying similarly dearly for policies on their TRS-80 line that were nowhere near as draconian as those of TI.) This meant far less innovation, far less interesting stuff to do with the TI-99/4A.

Early in 1983, Commodore slashed the wholesale price of the VIC-20 yet again; soon it was available for $139 at K-Mart. TI’s cuts in response brought the street price of the TI-99/4A down to about $150. But now they found to their horror that the tables were turned. TI sat at the break-even point, yet Commodore was able to cut the price of the VIC-20 still further, while also pummeling them from above with the powerful 64, whose price was plunging even more quickly than that of the VIC-20. TI was reduced to using the TI-99/4A as a loss leader. They would just break even on the computer, but would hopefully make their profits on the cartridges they also sold for it. That can be a good strategy in the right situation; for instance, in our own time it’s helped Amazon remake the face of publishing in a matter of a few years with their Kindle e-readers. But it’s dependent on having stuff that people want to buy from you after you sell them the loss leader. TI did not; the software they had to sell was mostly unimpressive in both quality and variety compared to that available for the developer-friendly Commodore machines. And the price of those Commodore machines just kept dropping, putting TI deeper and deeper into a hole as they kept struggling to match. Soon just breaking even on each TI-99/4A was only a beautiful memory.

By September the price of a 64 at a big-box discount store was less than $200, the VIC-20 about $80. Bill Turner had already been let go in disgrace. Now a desperate TI was selling the TI-99/4A at far below their own cost to make it, even as Commodore was continuing to make a modest profit on every unit sold thanks to continuous efforts to reduce production costs. At last, on October 28, 1983, TI announced that it was pulling out of the PC market altogether, having lost a stunning half a billion dollars on the venture to that point in 1983 and gutted their share price. The TI-99/4A had gone from world beater to fiasco in barely nine months; Turner from visionary to scapegoat in less. As a parting shot, TI dumped the rest of their huge unsold inventory of TI-99/4As onto the market, where at a street price of $50 or less they managed to cause a final bit of chaos for everyone left competing in the space.

But this Kamikaze measure was the worst they could do. Jack Tramiel had his revenge. He had beaten Bill Turner, paid him back with interest for 1982. More importantly, he had beaten his old nemesis TI, delivering an embarrassment and a financial ache from which it would take them a long time to recover. With the battlefield all but cleared, 1983 turned into the Christmas of the Commodore 64. By year’s end sales were ticking merrily past the 2-million-unit mark. Even with all the discounting, North American sales revenue on Commodore’s hardware for 1983 more than doubled from that of 1982. A few non-contenders like the Coleco Adam and second-stringers like Atari’s persistent computer line aside, the Home Computer Wars were over. When their MOS chip-making division and their worldwide sales were taken into account, Commodore was now bigger than Apple, bigger than anyone left standing in the PC market with the exception only of IBM and Radio Shack, both of whose PC divisions accounted for only a small part of their total revenue. The 64 had also surpassed the Apple II as simply the computer to own if you really liked games, while also filling the gap left by the imploded Atari VCS market and, increasingly as the price dropped, the low-end home-computer market previously owned by the VIC-20 and TI-99/4A. Thanks to the Commodore 64, computer games were going big time. Love the platform and its parent company or hate them (and plenty did the latter, not least due to Tramiel’s instinct for the double cross that showed itself in more anecdotes than I can possibly relate on this blog), everybody in entertainment software had to reckon with them. Thanks largely to Commodore and TI’s price war, computer use exploded in the United States between 1982 and 1984. In late 1982, Compute!, a magazine pitched to the ordinary consumer with a low-cost home computer, had a monthly circulation of 100,000. Eighteen months later it was over 500,000. 
The idea of 500,000 people who not only owned PCs but were serious enough about them to buy a magazine dedicated to the subject would have sounded absurd at the time that the Commodore 64 was launched. And Compute! was just one piece of an exploding ecosystem.

Yet even at this, the supreme pinnacle of Tramiel’s long career in business, there was a whiff of the Pyrrhic in the air as the battlefield cleared. The 64 had barely made it out the door before five of its six principal engineers, the men who had put together such a brilliant little machine on such a shoestring, left Commodore. Among them were both Al Charpentier, designer of its VIC-II graphics chip, and Bob Yannes, designer of its SID sound chip. The problems had begun when Tramiel refused to pay the team the bonuses they had expected upon completing the 64; his justification was that putting the machine together had taken them six months rather than the requested three. They got worse when Tramiel refused to let them start working on a higher-end follow-up to the 64 that would offer 80-column text, a better disk system and a better BASIC, and could directly challenge the likes of the Apple II and IBM PC. And they reached a breaking point when Tramiel decided not to give them pay raises when review time came, even though some of the junior engineers, like Yannes, were barely making a subsistence living.

The five engineers left to start a company of their own. For a first project, they contracted with Atari to produce My First Computer, a product which would, via a membrane keyboard and a BASIC implementation on cartridge, turn the aged VCS into a real, if extremely limited, computer for children to learn with. Tramiel, who wielded lawyers like cudgels and seemed to regard his employees as indentured servants at best, buried the fledgling start-up in lawsuits. By the time they managed to dig themselves out, the VCS was a distant memory. Perhaps for the best in the long run: three of the engineers, including Charpentier and Yannes, formed Ensoniq to pursue Yannes’s love of electronic music. They established a stellar reputation for their synthesizers and samplers and eventually for a line of sound cards for computers which were for years the choice of the discriminating audiophile. Commodore, meanwhile, was left wondering just who was going to craft the follow-up to the 64, just as they had wondered how they would replace Chuck Peddle after Tramiel drove him away in a similar hail of legal action.

Tramiel also inexplicably soured on Kit Spencer, mastermind of both the VIC-20’s and the 64’s public roll-outs, although he only sidelined him into all but meaningless bureaucratic roles rather than fire and/or sue him. Commodore’s advertising would never again be remotely as effective as it had been during the Spencer era. And in a move that attracted little notice at the time, Tramiel cut ties with Commodore’s few remaining dealers in late 1983. From now on the company would live or die with the mass merchandisers. For better or worse, Commodore was, at least in North America, now every bit a mass-market consumer-electronics company. The name “Commodore Business Machines” was truly a misnomer now, as the remnants of the business-oriented line that had begun with the original PET were left to languish and die. In later years, when they tried to build a proper support network for a more expensive machine called the Amiga, their actions of 1982 and 1983 would come back to haunt them. Few dealers would have any desire to get in bed with them again.

In January of 1984 things would get even stranger for this company that never could seem to win for long before a sort of institutionalized entropy pulled them sideways again. But we’ll save that story for later. Next time we’ll look at what Apple was doing in the midst of all this chaos.

(I highly recommend Joseph Nocera’s article in the April 1984 Texas Monthly for a look at the Home Computer Wars from the losers’ perspective.)

 

Posted on December 20, 2012 in Digital Antiquaria, Interactive Fiction

 
