

The Deal of the Century (or, The Alliance of Losers)


I think [the] Macintosh accomplished everything we set out to do and more, even though it reaches most people these days as Windows.

— Andy Hertzfeld (original Apple Macintosh systems programmer), 1994

When rumors first began to circulate early in 1991 that IBM and Apple were involved in high-level talks about a major joint initiative, most people dismissed them outright. It was, after all, hard to imagine two companies in the same industry with more diametrically opposed corporate cultures. IBM was Big Blue, a bedrock of American business since the 1920s. Conservative and pragmatic to a fault, it was a Brylcreemed bastion of tradition where casual days meant that employees might remove their jackets to reveal the starched white shirts they wore underneath. Apple, on the other hand, had been founded just fifteen years before by two long-haired children of the counterculture, and its campus still looked more like Woodstock than Wall Street. IBM placed great stock in the character of its workforce; Apple, as journalist Michael S. Malone would later put it in his delightfully arch book Infinite Loop, “seemed to have no character, but only an attitude, a style, a collection of mannerisms.” IBM talked about enterprise integration and system interoperability; Apple prattled on endlessly about changing the world. IBM played Lawrence Welk at corporate get-togethers; Apple preferred the Beatles. (It was an open secret that the name the company shared with the Beatles’ old record label wasn’t coincidental.)

Unsurprisingly, the two companies didn’t like each other very much. Apple in particular had been self-consciously defining itself for years as the sworn enemy of IBM and everything it represented. When Apple had greeted the belated arrival of the IBM PC in 1981 with a full-page magazine advertisement bidding Big Blue “welcome, seriously,” it had been hard to read as anything other than snarky sarcasm. And then, and most famously, had come the “1984” television advertisement to mark the debut of the Macintosh, in which Apple was personified as a hammer-throwing freedom fighter toppling a totalitarian corporate titan — Big Blue recast as Big Brother. What would the rumor-mongers be saying next? That cats would lie down with dogs? That the Russians would tell the Americans they’d given up on the whole communism thing and would like to be friends… oh, wait. It was a strange moment in history. Why not this too, then?

Indeed, when one looked a little harder, a partnership began to make at least a certain degree of sense. Apple’s rhetoric had actually softened considerably since those heady early days of the Macintosh and the acrimonious departure of Steve Jobs which had marked their ending. In the time since, more sober minds at the company had come to realize that insulting conservative corporate customers with money to spend on Apple’s pricey hardware might be counter-productive. Most of all, though, both companies found themselves in strikingly similar binds as the 1990s got underway. After soaring to rarefied heights during the early and middle years of the previous decade, they were now being judged by an increasing number of pundits as the two biggest losers of the last few years of computing history. In the face of the juggernaut that was Microsoft Windows, that irresistible force which nothing in the world of computing could seem to defy for long, it didn’t seem totally out of line to ask whether there even was a future for IBM or Apple. Seen in this light, the pithy clichés practically wrote themselves: “the enemy of my enemy is my friend”; “any port in a storm”; etc. Other, somewhat less generous commentators just talked about an alliance of losers.

Each of the two losers had gotten to this juncture by a uniquely circuitous route.

When IBM released the IBM PC, their first mass-market microcomputer, in August of 1981, they were as surprised as anyone by the way it took off. Even as hackers dismissed it as boring and unimaginative, corporate America couldn’t get enough of the thing; a boring and unimaginative personal computer — i.e., a safe one — was exactly what they had been waiting for. IBM’s profits skyrocketed during the next several years, and the pundits lined up to praise the management of this old, enormous company for having the flexibility and wherewithal to capitalize on an emerging new market; a tap-dancing elephant became the metaphor of choice.

And yet, like so many great successes, the IBM PC bore the seeds of its downfall within it from the start. It was a simple, robust machine, easy to duplicate by plugging together readily available commodity components — a process made even easier by IBM’s commitment to scrupulously documenting every last detail of its design for all and sundry. Further, IBM had made the mistake of licensing its operating system from a small company known as Microsoft rather than buying it outright or writing one of their own, and Bill Gates, Microsoft’s Machiavellian CEO, proved more than happy to license MS-DOS to anyone else who wanted it as well. The danger signs could already be seen in 1982, when an upstart company called Compaq released a “portable” version of IBM’s computer — in those days, this meant a computer which could be packed into a single suitcase — before IBM themselves could get around to it. A more dramatic tipping point arrived in 1986, when the same company made a PC clone built around Intel’s hot new 80386 CPU before IBM managed to do so.

In 1987, IBM responded to the multiplying ranks of the clone makers by introducing the PS/2 line, which came complete with a new, proprietary bus architecture, locked up tight this time inside a cage of patents and legalese. A cynical move on the face of it, it backfired spectacularly in practice. Smelling the overweening corporate arrogance positively billowing out of the PS/2 lineup, many began to ask themselves for the first time whether the industry still needed IBM at all. And the answer they often came to was not the one IBM would have preferred. IBM’s new bus architecture slowly died on the vine, while the erstwhile clone makers put together committees to define new standards of their own which evolved the design IBM had originated in more open, commonsense ways. In short, IBM lost control of the very platform they had created. By 1990, the words “PC clone” were falling out of common usage, to be replaced by talk of the “Wintel Standard.” The new standard bearer, the closest equivalent to IBM in this new world order, was Microsoft, who continued to license MS-DOS and Windows, the software that allowed all of these machines from all of these diverse manufacturers to run the same applications, to anyone willing to pay for it. Meanwhile OS/2, IBM’s mostly-compatible alternative operating system, was struggling mightily; it would never manage to cross the hump into true mass-market acceptance.

Apple’s fall from grace had been less dizzying in some ways, but the position it had left them in was almost as frustrating.

After Steve Jobs walked away from Apple in September of 1985, leaving behind the Macintosh, his twenty-month-old dream machine, the more sober-minded caretakers who succeeded him did many of the reasonable, sober-minded things which their dogmatic predecessor had refused to allow: opening the Mac up for expansion, adding much-requested arrow keys to its keyboard, toning down the revolutionary rhetoric that spooked corporate America so badly. These things, combined with the Apple LaserWriter laser printer, Aldus PageMaker software, and the desktop-publishing niche they spawned between them, saved the odd little machine from oblivion. Yet something did seem to get lost in the process. Although the Mac remained a paragon of vision in computing in many ways — HyperCard alone proved that! — Apple’s management could sometimes seem more interested in competing head-to-head with PC clones for space on the desks of secretaries than nurturing the original dream of the Macintosh as the creative, friendly, fun personal computer for the rest of us.

In fact, this period of Apple’s history must strike anyone familiar with the company of today — or, for that matter, with the company that existed before Steve Jobs’s departure — as just plain weird. Quibbles about character versus attitude aside, Apple’s most notable strength down through the years has been a peerless sense of self, which they have used to carve out their own uniquely stylish image in the ofttimes bland world of computing. How odd, then, to see the Apple of this period almost willfully trying to become the one thing neither the zealots nor the detractors have ever seen them as: just another maker of computer hardware. They flooded the market with more models than even the most dutiful fans could keep up with, none of them evincing the flair for design that marks the Macs of earlier or later eras. Their computers’ bland cases were matched with bland names like “Performa” or “Quadra” — names which all too easily could have come out of Compaq or (gasp!) IBM rather than Apple. Even the tight coupling of hardware and software into a single integrated user experience, another staple of Apple computing before and after, threatened to disappear, as CEO John Sculley took to calling Apple a “software company” and intimated that he might be willing to license MacOS to other manufacturers in the way that Microsoft did MS-DOS and Windows. At the same time, in a bid to protect the software crown jewels, he launched a prohibitively expensive and ethically and practically ill-advised lawsuit against Microsoft for copying MacOS’s “look and feel” in Windows.

Apple’s attempts to woo corporate America by acting just as bland and conventional as everyone else bore little fruit; the Macintosh itself remained too incompatible, too expensive, and too indelibly strange to lure cautious purchasing managers into the fold. Meanwhile Apple’s prices remained too high for any but the most well-heeled private users. And so the Mac soldiered on with a 5 to 10 percent market share, buoyed by a fanatically loyal user base who still saw revolutionary potential in it, even as they complained about how many of its ideas Microsoft and others had stolen. Admittedly, their numbers were not insignificant: there were about three and a half million members of the Macintosh family by 1990. They were enough to keep Apple afloat and basically profitable, at least for now, but already by the early 1990s most new Macs were being sold “within the family,” as it were. The Mac became known as the platform where the visionaries tried things out; if said things proved promising, they then reached the masses in the form of Windows implementations. CD-ROM, the most exciting new technology of the early 1990s, was typical. The Mac pioneered this space; Mediagenic’s The Manhole, the very first CD-ROM entertainment product, shipped first on that platform. Yet most of the people who heard the hype and went out to buy a “multimedia PC” in the years that followed brought home a Wintel machine. The Mac was a sort of aspirational showpiece platform; in defiance of the Mac’s old “computer for the rest of us” tagline, Windows was the place where the majority of ordinary people did ordinary things.

The state of MacOS added weight to these showhorse-versus-workhorse stereotypes. Its latest incarnation, known as System 6, had fallen alarmingly behind the state of the art in computing by 1990. Once one looked beyond its famously intuitive and elegant user interface, one found that it lacked robust support for multitasking; lacked ways to address memory beyond 8 MB; lacked the virtual memory that would allow users to open more and larger applications than the physical memory allowed; lacked the memory protection that could prevent errant applications from taking down the whole system. Having been baked into many of the operating system’s core assumptions from the start — MacOS had originally been designed to run on a machine with no hard drive and just 128 K of memory — these limitations were infuriatingly difficult to remedy after the fact. Thus Apple struggled mightily with the creation of a System 7, their attempt to do just that. When System 7 finally shipped in May of 1991, two years after Apple had initially promised it would, it still lagged behind Windows under the hood in some ways: for example, it still lacked comprehensive memory protection.

The problems which dogged the Macintosh were typical of any computing platform that attempts to survive beyond the technological era which spawned it. Keeping up with the times means hacking and kludging the original vision, as efficiency and technical elegance give way to the need just to make it work, by hook or by crook. The original Mac design team had been given the rare privilege of forgetting about backward compatibility — given permission to build something truly new and “insanely great,” as Steve Jobs had so memorably put it. That, needless to say, was no longer an option. Every decision at Apple must now be made with an eye toward all of the software that had been written for the Mac in the past seven years or so. People depended on it now, which sharply limited the ways in which it could be changed; any new idea that wasn’t compatible with what had come before was an ipso-facto nonstarter. Apple’s clever programmers doubtless could have made a faster, more stable, all-around better operating system than System 7 if they had only had free rein to do so. But that was pie-in-the-sky talk.

Yet the most pressing of all the technical problems confronting the Macintosh as it aged involved its hardware rather than its software. Back in 1984, the design team had hitched their wagon to the slickest, sexiest new CPU in the industry at the time: the Motorola 68000. And for several years, they had no cause to regret that decision. The 68000 and its successor models in the same family were wonderful little chips — elegant enough to live up to even the Macintosh ideal of elegance, an absolute joy to program. Even today, many an old-timer will happily wax rhapsodic about them if given half a chance. (Few, for the record, have similarly fond memories of Intel’s chips.)

But Motorola was both a smaller and a more diversified company than Intel, the international titan of chip-making. As time went on, they found it more and more difficult to keep up with the pace set by their rival. Lacking Intel’s cutting-edge fabrication facilities, they found it hard to pack as many circuits into the same amount of space. Matters began to come to a head in 1989, when Intel released the 80486, a chip to which Motorola had no remotely comparable answer. Motorola’s response finally arrived in the form of the roughly-equivalent-in-horsepower 68040 — but not until more than a year later, and even then their chip was plagued by poor heat dissipation and heavy power consumption in many scenarios. Worse, word had it that Motorola was getting ready to give up on the whole 68000 line; they simply didn’t believe they could continue to compete head-to-head with Intel in this arena. One can hardly overstate how terrifying this prospect was for Apple. An end to the 68000 line must seemingly mean the end of the Macintosh, at least as everyone knew it; MacOS, along with every application ever written for the platform, was inextricably bound to the 68000. Small wonder that John Sculley started talking about Apple as a “software company.” It looked like their hardware might be going away, whether they liked it or not.

Motorola was, however, peddling an alternative to the 68000 line, embodying one of the biggest buzzwords in computer-science circles at the time: “RISC,” short for “Reduced Instruction Set Computer.” Both the Intel x86 line and the Motorola 68000 line were what had been retroactively labeled “CISC,” or “Complex Instruction Set Computer,” designs: CPUs whose set of core opcodes — i.e., the set of low-level commands by which they could be directly programmed — grew constantly bigger and more baroque over time. RISC chips, on the other hand, pared their opcodes down to the bone, to only those commands which they absolutely, positively could not exist without. This made them less pleasant for a human programmer to code for — but then, the vast majority of programmers were working by now in high-level languages rather than directly controlling the CPU in assembly language anyway. And it made the programs that ran on them bigger, generally speaking, no matter how they were written — but then, most people by 1990 were willing to trade a bit more memory usage for extra speed. To compensate for these disadvantages, RISC chips could be simpler in terms of circuitry than CISC chips of equivalent power, making them cheaper and easier to manufacture. They also demanded less energy and produced less heat — the computer engineer’s greatest enemy — at equivalent clock speeds. As yet, only one RISC chip was serving as the CPU in mass-market personal computers: the ARM chip, used in the machines of the British PC maker Acorn, which weren’t even sold in the United States. Nevertheless, Motorola believed RISC’s time had come. By switching to RISC, they wouldn’t need to match Intel in terms of transistors per square millimeter to produce chips of equal or greater speed. Indeed, they’d already made a RISC CPU of their own, called the 88000, in which they were eager to interest Apple.
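To make the trade-off concrete, here is a hedged illustration: a trivial C function whose comments show the kind of instruction sequences a CISC and a RISC compiler of the era might plausibly have emitted for the same statement. The mnemonics are invented pseudo-assembly, not the real output of any 68000 or PowerPC toolchain.

```c
/* Illustrative only: plausible instruction sequences for the same C
 * statement on a CISC versus a RISC design of the early 1990s. The
 * mnemonics in the comments are pseudo-assembly, not real compiler
 * output. */

int a, b;

void add_in_memory(void)
{
    a = a + b;
    /* CISC style: one dense instruction that reads and writes memory:
     *     add.l   b, a
     *
     * RISC style: several fixed-size register-to-register steps:
     *     lwz     r3, a        ; load a
     *     lwz     r4, b        ; load b
     *     add     r3, r3, r4   ; add in registers
     *     stw     r3, a        ; store the result
     *
     * More instructions and bigger code, but each one is trivial to
     * decode and pipeline -- hence the simpler, cooler-running chips
     * described above. */
}
```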

They found a receptive audience among Apple’s programmers and engineers, who loved Motorola’s general design aesthetic. Already by the spring of 1990, Apple had launched two separate internal projects to study the possibilities for RISC in general and the 88000 in particular. One, known as Project Jaguar, envisioned a clean break with the past, in the form of a brand new computer that would be so amazing that people would be willing to accept that none of their existing software would run on it. The other, known as Project Cognac, studied whether it might be possible to port the existing MacOS to the new architecture, and then — and this was the really tricky part — find a way to make existing applications which had been compiled for a 68000-based Mac run unchanged on the new machine.

At first, the only viable option for doing so seemed to be a sort of Frankenstein’s monster of a computer, containing both an 88000- and a 68000-series CPU. The operating system would boot and run on the 88000, but when the user started an application written for an older, 68000-based Mac, it would be automatically kicked over to the secondary CPU. Within a few years, so the thinking went, all existing users would upgrade to the newer models, all current software would get recompiled to run natively on the RISC chip, and the 68000 could go away. Still, no one was all that excited by this approach; it seemed the worst Macintosh kludge yet, the very antithesis of what the machine was supposed to be.

A eureka moment came in late 1990, with the discovery of what Cognac project leader Jack McHenry came to call the “90/10 Rule.” Running profilers on typical applications, his team found that in the case of many or most of them it was the operating system, not the application itself, that consumed 90 percent or more of the CPU cycles. This was an artifact — for once, a positive one! — of the original MacOS design, which offered programmers an unprecedentedly rich interface toolbox meant to make coding as quick and easy as possible and, just as importantly, to give all applications a uniform look and feel. Thus an application simply asked for a menu containing a list of entries; it was then the operating system that did all the work of setting it up, monitoring it, and reporting back to the application when the user chose something from it. Ditto buttons, dialog boxes, etc. Even something as CPU-intensive as video playback generally happened inside the operating system’s QuickTime library rather than inside the application that invoked it.
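The division of labor that the 90/10 Rule uncovered is easy to see in code. Below is a minimal sketch of the classic Toolbox menu pattern in C, assuming the old-style interfaces in Menus.h; the menu ID and item strings are invented for illustration. Note how little the application itself does: each call hands the real work of drawing, mouse tracking, and hit testing over to the operating system.

```c
#include <Menus.h>   /* classic Macintosh Toolbox menu interfaces */

enum { kFileMenuID = 128 };   /* arbitrary menu ID for this example */

static void SetUpFileMenu(void)
{
    /* Ask the Toolbox for a menu; the OS owns and draws it from here on. */
    MenuHandle fileMenu = NewMenu(kFileMenuID, "\pFile");
    AppendMenu(fileMenu, "\pNew/N;Open/O;Quit/Q");
    InsertMenu(fileMenu, 0);  /* 0 = add at the end of the menu bar */
    DrawMenuBar();
}

/* On a click in the menu bar, the application makes a single call.
 * The OS then tracks the mouse, highlights items, and reports back
 * only the user's final choice -- all of it "system" CPU time. */
static void HandleMenuClick(Point where)
{
    long choice = MenuSelect(where);
    short menuID = HiWord(choice);
    short itemID = LoWord(choice);
    /* ...dispatch on menuID and itemID here... */
}
```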

All of this meant that it ought to be feasible to emulate the 68000 entirely in software. The 68000 code would necessarily run slowly and inefficiently through emulation, wiping out all of the speed advantages of the new chip and then some. Yet for many or most applications the emulator would only need to be used about 10 percent of the time. The other 90 percent of the time, when the operating system itself was doing things at native speed, would more than make up for it. In due course, applications would get recompiled and the need for 68000 emulation would largely go away. But in the meanwhile, it could provide a vital bridge between the past and the future — a next-generation Mac that wouldn’t break continuity with the old one, all with a minimum of complication, for Apple’s users and for their hardware engineers alike. By mid-1991, Project Cognac had an 88000-powered prototype that could run a RISC-based MacOS and legacy Mac applications together.
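Apple’s production emulator was by all accounts a painstakingly hand-optimized piece of work, and its source was never published. But the basic shape of any interpreting emulator can be sketched generically: fetch a 68000 opcode, decode it, and dispatch to a native routine that reproduces its effect. Everything below is a simplified illustration of that technique, with just a single opcode implemented; it is not Apple’s code.

```c
#include <stdint.h>

/* A drastically simplified 68000: data and address registers plus a
 * program counter. (The real chip also has condition codes, a stack
 * pointer pair, a supervisor mode, and much more.) */
typedef struct {
    uint32_t d[8], a[8];   /* data and address registers */
    uint32_t pc;           /* program counter into guest memory */
    uint8_t *mem;          /* emulated ("guest") memory */
} M68KState;

static uint16_t fetch16(M68KState *cpu)
{
    /* 68000 opcodes are 16 bits wide and big-endian. */
    uint16_t op = (uint16_t)((cpu->mem[cpu->pc] << 8) | cpu->mem[cpu->pc + 1]);
    cpu->pc += 2;
    return op;
}

/* The interpreter loop: every emulated instruction costs dozens of
 * native ones, which is why it mattered so much that 90 percent of a
 * typical application's time was spent in native operating-system
 * code rather than in here. */
void run(M68KState *cpu, long steps)
{
    while (steps-- > 0) {
        uint16_t op = fetch16(cpu);
        switch (op & 0xF000) {        /* crude decode on the top nibble */
        case 0x7000:                  /* MOVEQ #imm8, Dn */
            cpu->d[(op >> 9) & 7] = (uint32_t)(int32_t)(int8_t)(op & 0xFF);
            break;
        /* ...a real emulator decodes hundreds of further cases... */
        default:
            return;                   /* unimplemented opcode: bail out */
        }
    }
}
```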

And yet this wasn’t to be the final form of the RISC-based Macintosh. For, just a few months later, Apple and IBM made an announcement that the technology press billed — sometimes sarcastically, sometimes earnestly — as the “Deal of the Century.”

Apple had first begun to talk with IBM in early 1990, when Michael Spindler, the former’s president, had first reached out to Jack Kuehler, his opposite number at IBM. It seemed that, while Apple’s technical rank and file were still greatly enamored with Motorola, upper management was less sanguine. Having been burned once with the 68000, they were uncertain about Motorola’s commitment and ability to keep evolving the 88000 over the long term.

It made a lot of sense in the abstract for any company interested in RISC technology, as Apple certainly was, to contact IBM; it was actually IBM who had invented the RISC concept back in the mid-1970s. Not all that atypically for such a huge company with so many ongoing research projects, they had employed the idea for years only in limited, mostly subsidiary usage scenarios, such as mainframe channel controllers. Now, though, they were just introducing a new line of “workstation computers” — meaning extremely high-powered desktop computers, too expensive for the consumer market — which used a RISC chip called the POWER CPU that was the heir to their many years of research in the field. Like the workstations it lay at the heart of, the chip was much too expensive and complex to become the brain of Apple’s next generation of consumer computers, but it might, thought Spindler, be something to build upon. And he knew that, with IBM’s old partnership with Microsoft slowly collapsing into bickering acrimony, Big Blue might just be looking for a new partner.

The back-channel talks were intermittent and hyper-cautious at first, but, as the year wore on and the problems both of the companies faced became more and more obvious, the discussions heated up. The first formal meeting took place in February of 1991 or shortly thereafter, at an IBM facility in Austin, Texas. The Apple people, knowing IBM’s ultra-conservative reputation and wishing to make a good impression, arrived neatly groomed and dressed in three-piece suits, only to find their opposite numbers, having acted on the same motivation, sitting there in jeans and denim shirts.

That anecdote illustrates how very much both sides wanted to make this work. And indeed, the two parties found it much easier to work together than anyone might have imagined. John Sculley, the man who really called the shots at Apple, found that he got along smashingly with Jack Kuehler, to the extent that the two were soon talking almost every day. After beginning as a fairly straightforward discussion of whether IBM might be able and willing to make a RISC chip suitable for the Macintosh, the negotiations just kept growing in scale and ambition, spurred on by both companies’ deep-seated desire to stick it to Microsoft and the Wintel hegemony in any and all possible ways. They agreed to found a joint subsidiary called Taligent, staffed initially with the people from Apple’s Project Jaguar, which would continue to develop a brand new operating system that could be licensed by any hardware maker, just like MS-DOS and Windows (and for that matter IBM’s already extant OS/2). And they would found another subsidiary called Kaleida Labs, to make a cross-platform multimedia scripting engine called ScriptX.

Still, the core of the discussions remained IBM’s POWER architecture — or rather the PowerPC, as the partners agreed to call the cost-reduced, consumer-friendly version of the chip. Apple soon pulled Motorola into these parts of the talks, thus turning a bilateral into a trilateral negotiation, and providing the name for their so-called “AIM alliance” — “AIM” for Apple, IBM, and Motorola. IBM had never made a mass-market microprocessor of their own before, noted Apple, and Motorola’s experience could serve them well, as could their chip-fabrication facilities once actual production began. The two non-Apple parties were perhaps less excited at the prospect of working together — Motorola in particular must have been smarting at the rejection of their own 88000 processor which this new plan would entail — but made nice and got along.

Jack Kuehler and John Sculley brandish what they call their “marriage certificate,” looking rather disturbingly like Neville Chamberlain declaring peace in our time. The marriage would not prove an overly long or happy one.

On October 2, 1991 — just six weeks after the first 68040-based Macintosh models had shipped — Apple and IBM made official the rumors that had been swirling around for months. At a joint press briefing held inside the Fairmont Hotel in downtown San Francisco, they trumpeted all of the initiatives I’ve just described. The Deal of the Century, they said, would usher in the next phase of personal computing. Wintel must soon give way to the superiority of a PowerPC-based computer running a Taligent operating system with ScriptX onboard. New Apple Macintosh models would also use the PowerPC, but the relationship between them and these other, Taligent-powered machines remained vague.

Indeed, it was all horribly confusing. “What Taligent is doing is not designed to replace the Macintosh,” said Sculley. “Instead we think it complements and enhances its usefulness.” But what on earth did that empty corporate speak even mean? When Apple said out of the blue that they were “not going to do to the Macintosh what we did to the Apple II” — i.e., orphan it — it rather made you suspect that that was exactly what they meant to do. And what did it all mean for IBM’s OS/2, which Big Blue had been telling a decidedly unconvinced public was also the future of personal computing for several years now? “I think the message in those agreements for the future of OS/2 is that it no longer has a future,” said one analyst. And then, what was Kaleida and this ScriptX thing supposed to actually do?

So much of the agreement seemed so hopelessly vague. Compaq’s vice president declared that Apple and IBM must be “smoking dope. There’s no way it’s going to work.” One pundit called the whole thing “a con job. There’s no software, there’s no operating system. It’s just a last gasp of extinction by the giants that can’t keep up with Intel.” Apple’s own users were baffled and consternated by this sudden alliance with the company which they had been schooled to believe was technological evil incarnate. A grim joke made the rounds: what do you get when you cross Apple and IBM? The answer: IBM.

While the journalists reported and the pundits pontificated, it was up to the technical staff at Apple, IBM, and Motorola to make PowerPC computers a reality. Like their colleagues who had negotiated the deal, they all got along surprisingly well; once one pushed past the surface stereotypes, they were all just engineers trying to do the best work possible. Apple’s management wanted the first PowerPC-based Macintosh models to ship in January of 1994, to commemorate the platform’s tenth anniversary by heralding a new technological era. The old Project Cognac team, now with the new code name of “Piltdown Man” after the famous (albeit fraudulent) “missing link” in the evolution of humanity, was responsible for making this happen. For almost a year, they worked on porting MacOS to the PowerPC, as they’d previously done to the 88000. This time, though, they had no real hardware with which to work, only specifications and software emulators. The first prototype chips finally arrived on September 3, 1992, and they redoubled their efforts, pulling many an all-nighter. Thus MacOS booted up to the desktop for the first time on a real PowerPC-based machine just in time to greet the rising sun on the morning of October 3, 1992. A new era had indeed dawned.

Their goal now was to make a PowerPC-based Macintosh work exactly like any other, only faster. MacOS wouldn’t even get a new primary version number for the first PowerPC release; this major milestone in Mac history would go under the name of System 7.1.2, a name more appropriate to a minor maintenance release. It looked so identical to what had come before that its own creators couldn’t spot the difference; they wound up lighting up a single extra pixel in the PowerPC version just so they could know which was which.

Their guiding rule of an absolutely seamless transition applied in spades to the 68000 emulation layer, duly ported from the 88000 to the PowerPC. An ordinary user should never have to think about — should not even have to know about — the emulation that was happening beneath the surface. Another watershed moment came in June of 1993, when the team brought a PowerPC prototype machine to MacHack, a coding conference and competition. Without telling any of the attendees what was inside the machine, the team let them use it to demonstrate their boundary-pushing programs. The emulation layer performed beyond their most hopeful prognostications. It looked like the Mac’s new lease on life was all but a done deal from the engineering side of things.

But alas, the bonhomie exhibited by the partner companies’ engineers and programmers down in the trenches wasn’t so marked in their executive suites after the deal was signed. The very vagueness of so many aspects of the agreement had papered over what were in reality hugely different visions of the future. IBM, a company not usually given to revolutionary rhetoric, had taken at face value the high-flown words spoken at the announcement. They truly believed that the agreement would mark a new era for personal computing in general, with a new, better hardware architecture in the form of PowerPC and an ultra-modern operating system to run on it in the form of Taligent’s work. Meanwhile it was becoming increasingly clear that Apple’s management, who claimed to be changing the world five times before breakfast on most days, had in reality seen Taligent largely as a hedge in case their people should prove unable to create a PowerPC Macintosh that looked like a Mac, felt like a Mac, and ran vintage Mac software. As Project Piltdown Man’s work proceeded apace, Apple grew less and less enamored with those other, open-architecture ideas IBM was pushing. The Taligent people didn’t help their cause by falling headfirst into a pit of airy computer-science abstractions and staying mired there for years, all while Project Piltdown Man just kept plugging away, getting things done.

The first two and a half years of the 1990s were marred by a mild but stubborn recession in the United States, during which the PC industry had a particularly hard time of it. After the summer of 1992, however, the economy picked up steam and consumer computing eased into what would prove its longest and most sustained boom of all time, borne along on a wave of hype about CD-ROM and multimedia, along with the simple fact that personal computers in general had finally evolved to a place where they could do useful things for ordinary people in a reasonably painless way. (A bit later in the boom, of course, the World Wide Web would come along to provide the greatest impetus of all.)

And yet the position of both Apple and IBM in the PC marketplace continued to get steadily worse while the rest of their industry soared. At least 90 percent of the computers that were now being sold in such impressive numbers ran Microsoft Windows, leaving OS/2, MacOS, and a few other oddballs to divide the iconoclasts, the hackers, and the non-conformists of the world among themselves. While IBM continued to flog OS/2, more out of stubbornness than hope, Apple tried a little bit of everything to stop the slide in market share and remain relevant. Still not entirely certain whether their future lay with open architectures or their own closed, proprietary one, they started porting selected software to Windows, including most notably QuickTime, their much-admired tool for encoding and playing video. They even shipped a Mac model that could also run MS-DOS and Windows, thanks to an 80486 housed in its case alongside its 68040. And they entered into a partnership with the networking giant Novell to port MacOS itself to Intel hardware — a partnership that, like many Apple initiatives of these years, petered out without ultimately producing much of anything. Perhaps most tellingly of all, this became the only period in Apple’s history when the company felt compelled to compete solely on price. They started selling Macs in department stores for the first time, where a stream of very un-Apple-like discounts and rebates greeted prospective buyers.

While Apple thus toddled along without making much headway, IBM began to annihilate all previous conceptions of how much money a single company could possibly lose, posting oceans of red that looked more like the numbers found in macroeconomic research papers than entries in an accountant’s books. The PC marketplace was in a way one of their smaller problems. Their mainframe business, their real bread and butter since the 1950s, was cratering as customers fled to the smaller, cheaper computers that could often now do the jobs of those hulking giants just as well. In 1991, when IBM first turned the corner into loss, they did so in disconcertingly convincing fashion: they lost $2.82 billion that year. And that was only the beginning. Losses totaled $4.96 billion in 1992, followed by $8.1 billion in 1993. IBM lost more money during those three years alone than any other company in the history of the world to that point; their losses exceeded the gross domestic product of Ecuador.

The employees at both Apple and IBM paid the toll for the confusions and prevarications of these years: both companies endured rounds of major layoffs. Those at IBM marked the very first such in the long history of the company. Big Blue had for decades fostered a culture of employment for life; their motto had always been, “If you do your job, you will always have your job.” This, it was now patently obvious, was no longer the case.

The bloodletting reached the two companies’ executive suites as well, within a few months of one another. On April 1, 1993, John Akers, the CEO of IBM, was ousted after a seven-year tenure which one business writer called “the worst record of any chief executive in the history of IBM.” Three months later, following a terrible quarterly earnings report and a drop in share price of 58 percent in the span of six months, Michael Spindler replaced John Sculley as the CEO of Apple.

These, then, were the storm clouds under which the PowerPC architecture became a physical reality.

The first PowerPC computers to be given a public display bore an IBM rather than an Apple logo on their cases. They arrived at the Comdex trade show in November of 1993, running a port of OS/2. IBM also promised a port of AIX — their version of the Unix operating system — while Sun Microsystems announced plans to port their Unix-based Solaris operating system and, most surprisingly of all, Microsoft talked about porting over Windows NT, the more advanced, server-oriented version of their world-conquering operating environment. But, noted the journalists present, “it remains unclear whether users will be able to run Macintosh applications on IBM’s PowerPC” — a fine example of the confusing messaging the two alleged allies constantly trailed in their wake. Further, there was no word at all about the status of the Taligent operating system that was supposed to become the real PowerPC standard.

Meanwhile over at Apple, Project Piltdown Man was becoming that rarest of unicorns in tech circles: a major software-engineering project that is actually completed on schedule. The release of the first PowerPC Macs was pushed back a bit, but only to allow the factories time to build up enough inventory to meet what everyone hoped would be serious consumer demand. Thus the “Power Macs” made their public bow on March 14, 1994, at New York City’s Lincoln Center, in three different configurations clocked at speeds between 60 and 80 MHz. Unlike IBM’s machines, which were shown six months before they shipped, the Power Macs were available for anyone to buy the very next day.

The initial trio of Power Macs.

This speed test, published in MacWorld magazine, shows how all three of the Power Mac machines dramatically outperform top-of-the-line Pentium machines when running native code.

They were greeted with enormous excitement and enthusiasm by the Mac faithful, who had been waiting anxiously for a machine that could go head-to-head with computers built around Intel’s new Pentium chip, the successor to the 80486. This the Power Macs could certainly do; by some benchmarks at least, the PowerPC doubled the overall throughput of a Pentium. World domination must surely be just around the corner, right?

Predictably enough, the non-Mac-centric technology press greeted the machines’ arrival more skeptically than the hardcore Mac-heads. “I think Apple will sell [a] million units, but it’s all going to be to existing Mac users,” said one market researcher. “DOS and Windows running on Intel platforms is still going to be 85 percent of the market. [The Power Mac] doesn’t give users enough of a reason to change.” Another noted that “the Mac users that I know are not interested in using Windows, and the Windows users are not interested in using the Mac. There has to be a compelling reason [to switch].”

In the end, these more guarded predictions proved the most accurate. Apple did indeed sell an impressive spurt of Power Macs in the months that followed, but almost entirely to the faithful. One might almost say that they became a victim of Project Piltdown Man’s success: the Power Mac really did seem exactly like any other Macintosh, except that it ran faster. And even this fact could be obscured when running legacy applications under emulation, as most people were doing in the early months: despite Project Piltdown Man’s heroic efforts, applications like Excel, Word, and Photoshop actually ran slightly slower on a Power Mac than on a top-of-the-line 68040-based machine. So, while the transition to PowerPC allowed the Macintosh to persist as a viable computing platform, it ultimately did nothing to improve upon its small market share. And because the PowerPC MacOS was such a direct and literal port, it still retained all of the shortcomings of MacOS in general. It remained a pretty interface stretched over some almost laughably archaic plumbing. The new generation of Mac hardware wouldn’t receive an operating system truly, comprehensively worthy of it until OS X arrived seven long years later.

Still, these harsh realities shouldn’t be allowed to detract from how deftly Apple — and particularly the unsung coders of Project Piltdown Man — executed the transition. No one before had ever picked up a consumer-computing platform bodily and moved it to an entirely new hardware architecture, much less done it so transparently that many or most users never really had to think about what was happening at all. (There would be only one comparable example in computing’s future. And, incredibly, the Mac would once again be the platform in question: in 2006, Apple would move from the fading PowerPC line to Intel’s chips — if you can’t beat ’em, join ’em, right? — relying once again on a cleverly coded software emulator to see them through the period of transition. The Macintosh, it seems, has more lives than Lazarus.)

Although the briefly vaunted AIM alliance did manage to give the Macintosh a new lease on life, it succeeded in very little else. The PowerPC architecture, which had cost the alliance more than $1 billion to develop, went nowhere in its non-Mac incarnations. IBM’s own machines sold in such tiny numbers that the question of whether Apple would ever allow them to run MacOS was all but rendered moot. (For the record, though: they never did.) Sun Solaris and Microsoft Windows NT did come out in PowerPC versions, but their sales couldn’t justify their existence, and within a year or two they went away again. The bold dream of creating a new reference platform for general-purpose computing to rival Wintel never got off the ground, as it became painfully clear that said dream had been taken more to heart by IBM than by Apple. Only after the millennium would the PowerPC architecture find a measure of mass-market success outside the Mac, when it was adopted by Nintendo, Microsoft, and Sony for use in videogame consoles. In this form, then, it finally paid off for IBM; far more PowerPC-powered consoles than even Macs were sold over the lifetime of the architecture. PowerPC also eventually saw use in other specialized applications, such as satellites and planetary rovers employed by NASA.

Success, then, is always relative. But not so the complete lack thereof, as Kaleida and Taligent proved. Kaleida burned through $200 million before finally shipping its ScriptX multimedia-presentation engine years after other products, most notably Macromedia’s Director, had already sewn up that space; it was disbanded and harvested for scraps by Apple in November of 1995. Taligent burned through a staggering $400 million over the same period of time, producing only some tepid programming frameworks in lieu of the revolutionary operating system that had been promised, before being absorbed back into IBM.

There is one final fascinating footnote to this story of a Deal of the Century that turned out to be little more than a strange anecdote in computing history. In the summer of 1994, IBM, having stopped the worst of the bleeding and settled by now into their new life as a smaller, far less dominant company, offered to buy Apple outright for a premium of $5 over their current share price. In IBM’s view, the synergies made sense: the Power Macs were selling extremely well, which was more than could be said for IBM’s PowerPC models. Why not go all in?

Ironically, it was those same healthy sales numbers that scuppered the deal in the end. If the offer had come a year earlier, when a money-losing Apple was just firing John Sculley, they surely would have jumped at it. But now Apple was feeling their oats again, and by no means entirely without reason; sales were up more than 20 percent over the previous year, and the company was once more comfortably in the black. So, they told IBM thanks, but no thanks. The same renewed taste of success also caused them to reject serious inquiries from Philips, Sun Microsystems, and Oracle. Word had it that new CEO Michael Spindler was convinced not only that the Power Mac had saved Apple, but that it had fundamentally altered their position in the marketplace.

The following year revealed how misguided that thinking really was; the Power Mac had fixed none of Apple’s fundamental problems. That year it was Microsoft who cemented their world domination instead, with the release of Windows 95, while Apple grappled with the reality that almost all of those Power Mac sales of the previous year had been to existing members of the Macintosh family, not to the new customers they so desperately needed to attract. What happened now that everyone in the family had dutifully upgraded? The answer to that question wasn’t pretty: Apple plunged off a financial cliff as precipitous in its own way as the one which had nearly destroyed IBM a few years earlier. Now, nobody was interested in acquiring them anymore. The pundits smelled the stink of death; it’s difficult to find an article on Apple written between 1995 and 1998 which doesn’t include the adjective “beleaguered.” Why buy now when you can sift through the scraps at the bankruptcy auction in just a little while?

Apple didn’t wind up dying, of course. Instead a series of improbable events, beginning with the return of prodigal-son Steve Jobs in 1997, turned them into the richest single company in the world — yes, richer even than Microsoft. These are stories for other articles. But for now, it’s perhaps worth pausing for a moment to think about an alternate timeline where the Macintosh became an IBM product, and the Deal of the Century that got that ball rolling thus came much closer to living up to its name. Bizarre, you say? Perhaps. But no more bizarre than what really happened.

(Sources: the books Insanely Great: The Life and Times of Macintosh by Steven Levy, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer, Infinite Loop: How the World’s Most Insanely Great Computer Company Went Insane by Michael S. Malone, Big Blues: The Unmaking of IBM by Paul Carroll, and The PowerPC Macintosh Book by Stephan Somogyi; InfoWorld of September 24 1990, October 15 1990, December 3 1990, April 8 1991, May 13 1991, May 27 1991, July 1 1991, July 8 1991, July 15 1991, July 22 1991, August 5 1991, August 19 1991, September 23 1991, September 30 1991, October 7 1991, October 21 1991, November 4 1991, December 30 1991, January 13 1992, January 20 1992, February 3 1992, March 9 1992, March 16 1992, March 23 1992, April 27 1992, May 11 1992, May 18 1992, June 15 1992, June 29 1992, July 27 1992, August 3 1992, August 10 1992, August 17 1992, September 7 1992, September 21 1992, October 5 1992, October 12 1992, October 19 1992, December 14 1992, December 21 1992, December 28 1992, January 11 1993, February 1 1993, February 22 1993, March 8 1993, March 15 1993, April 5 1993, April 12 1993, May 17 1993, May 24 1993, May 31 1993, June 21 1993, June 28 1993, July 5 1993, July 12 1993, July 19 1993, August 2 1993, August 9 1993, August 30 1993, September 6 1993, September 27 1993, October 4 1993, October 11 1993, October 18 1993, November 1 1993, November 15 1993, November 22 1993, December 6 1993, December 13 1993, December 20 1993, January 10 1994, January 31 1994, March 7 1994, March 14 1994, March 28 1994, April 25 1994, May 2 1994, May 16 1994, June 6 1994, June 27 1994; MacWorld of September 1992, February 1993, July 1993, September 1993, October 1993, November 1993, February 1994, and May 1994; Byte of November 1984. Online sources include IBM’s own corporate-history timeline and a vintage IBM lecture on the PowerPC architecture.)

 
 


Doing Windows, Part 8: The Outsiders

Microsoft Windows 3.0’s conquest of the personal-computer marketplace was bad news for a huge swath of the industry. On the software side, companies like Lotus and WordPerfect, only recently so influential that it was difficult to imagine a world that didn’t include them, would never regain the clout they had enjoyed during the 1980s, and would gradually fade away entirely. On the hardware side, it was true that plenty of makers of commodity PC clones were happier to work with a Microsoft who believed a rising tide lifted all their boats than against an IBM that was continually trying to put them out of business. But what of Big Blue themselves, still the biggest hardware maker of all, who were accustomed to dictating the direction of the industry rather than being dictated to by any mere maker of software? And what, for that matter, of Apple? Both Apple and IBM found themselves in the unaccustomed position of being the outsiders in this new Windows era of computing. Each must come to terms with Microsoft’s newfound but overwhelming power, even as each remained determined not to give up the heritage of innovation that had gotten them this far.

Having chosen to declare war on Microsoft in 1988, Apple seemed to have a very difficult road indeed in front of them — and that was before Xerox unexpectedly reentered the picture. On December 14, 1989, the latter shocked everyone by filing a $150 million lawsuit of their own, accusing Apple of ripping off the user interface employed by the Xerox Star office system before Microsoft allegedly ripped the same thing off from Apple.

The many within the computer industry who had viewed the implications of Apple’s recent actions with such concern couldn’t help but see this latest development as the perfect comeuppance for their overweening position on “look and feel” and visual copyright. These people now piled on with glee. “Apple can’t have it both ways,” said John Shoch, a former Xerox PARC researcher, to the New York Times. “They can’t complain that Microsoft [Windows has] the look and feel of the Macintosh without acknowledging the Mac has the look and feel of the Star.” In his 1987 autobiography, John Sculley himself had written the awkward words that “the Mac, like the Lisa before it, was largely a conduit for technology” developed by Xerox. How exactly was it acceptable for Apple to become a conduit for Xerox’s technology but unacceptable for Microsoft to become a conduit for Apple’s? “Apple is running around persecuting Microsoft over things they borrowed from Xerox,” said one prominent Silicon Valley attorney. The Xerox lawsuit raised uncomfortable questions of the sort which Apple would have preferred not to deal with: questions about the nature of software as an evolutionary process — ideas building upon ideas — and what would happen to that process if everyone started suing everyone else every time somebody built a better mousetrap.

Still, before we join the contemporary commentators in their jubilation at seeing Apple hoisted with their own petard, we should consider the substance of this latest case in more detail. Doing so requires that we take a closer look at what Xerox had actually created back in the day, and take particularly careful note of which of those creations was named in their lawsuit.

Broadly speaking, Xerox created two different GUI environments in the course of their years of experimentation in this area. The first and most heralded of these was known as the Smalltalk environment, pioneered by the researcher Alan Kay in 1975 on a machine called the Xerox Alto, which had been designed at PARC and was built only in limited quantities, without ever being made available for sale through traditional commercial channels. This was the machine and the environment which Steve Jobs so famously saw on his pair of visits to PARC in December of 1979 — visits which directly inspired first the Apple Lisa and later the Macintosh.

The Smalltalk environment running on a Xerox Alto, a machine built at Xerox PARC in the mid-1970s but never commercially released. Many of the basic ideas of the GUI are here, but much remains to be developed and much is implemented only in a somewhat rudimentary way. For instance, while windows can overlap one another, windows that are obscured by other windows are never redrawn. In this way the PARC researchers neatly avoided one of the most notoriously difficult aspects of implementing a windowing system. When Apple programmer Bill Atkinson was part of the delegation who made that December 1979 visit to PARC, he thought he did see windows that continued to update even when partially obscured by other windows. He then proceeded to find a way to give the Lisa and Macintosh’s windowing engine this capability. Seldom has a misunderstanding had such a fortuitous result.

Xerox’s one belated attempt to parlay PARC’s work on the GUI into a real commercial product took the form of the Xerox Star, an integrated office-productivity system costing $16,500 per workstation upon its release in 1981. Neither Kay nor most of the other key minds behind the Alto and Smalltalk were involved in its development. Yet its GUI strikes modern eyes as far more refined than that of Smalltalk. Importantly, the metaphor of the desktop, and the soon-to-be ubiquitous idea of a skeuomorphic user interface built from stand-ins for real-world office equipment — a trash can, file folders, paper documents, etc. — were apparently the brainchildren of the product-focused Star team rather than the blue-sky researchers who worked at PARC during the 1970s.

The Xerox Star office system, which was released in 1981. This system looks much more familiar to our modern eyes than the Xerox Alto’s Smalltalk, sporting such GUI staples as menus, widgets, and icons. Yet it was still lacking in many areas compared to the GUIs that would follow. Windows were neither free-dragging nor overlapping, and its menus were one-shot commands, not drop-down lists. It most resembles VisiCorp’s Visi On among the GUIs we’ve looked at closely in this series of articles. Both products serve as a telling snapshot of the state of the art in GUIs just before Apple shook everything up with the Lisa and Macintosh.

The Star, which failed dismally due to its high price and Xerox’s lack of marketing acumen, is often reduced to little more than a footnote to the story of PARC, treated as a workmanlike translation of PARC’s grand ideas and technologies into a somewhat problematic product. Yet there’s actually an important philosophical difference between Smalltalk and the Star, born of the different engineering cultures that produced them. Smalltalk emphasized programming, to the point that the environment could literally be re-programmed on the fly as you used it. This was very much in keeping with the early ethos of home computing as well, when all machines booted into BASIC and an ability to program was considered key for every young person’s future — when every high school, it seemed, was instituting classes in BASIC or Pascal. The Star, on the other hand, was engineered to ensure that the non-technical office worker never needed to see a line of code; this machine conformed to the human rather than asking the human to conform to it. One might say that Smalltalk was intended to make the joy of computing — of using the computer as the ultimate anything machine — as accessible as possible, while the Star was intended to make you forget that you were using a computer at all.

While I certainly don’t wish to dismiss or minimize the visionary work done at PARC in the 1970s, I do believe that historians of early microcomputer GUIs have tended to somewhat over-emphasize the innovations of Smalltalk and the Alto while selling the Xerox Star’s influence rather short. Steve Jobs’s early visits to PARC are given much weight in the historical record, but it’s sometimes forgotten that anything Apple wished to copy from Smalltalk had to be done from memory; they had no regular access to the PARC technology after those visits. The Star, on the other hand, did ship as a commercial product some two years before the Lisa. Notably, the Star’s philosophy of hiding the “computery” aspects of computing from the user would turn out to be much more in line with the one that guided the Lisa and Macintosh than was Smalltalk’s approach of exposing its innards for all to see and modify. The Star was a closed black box, capable of running only the software provided for it by Xerox. Similarly, the Lisa couldn’t be programmed at all except by buying a second Lisa and chaining the two machines together, and even the Macintosh never had the reputation of being a hacker’s plaything in the way of the earlier, more hobbyist-oriented Apple II. The Lisa and Macintosh thus joined the Star in embracing a clear divide between coding professionals, who wrote the software, and end users, who bought it and used it to get stuff done. One could thus say that they resemble the Star much more than Smalltalk not only visually but philosophically.

Counter-intuitive though it is to the legend of the Macintosh being a direct descendant of the work Steve Jobs saw at PARC, Xerox sued Apple over the interface elements they had allegedly stolen from the Star rather than Smalltalk. In evaluating the merits of their claim today, I’m somewhat hamstrung by the fact that no working emulators of the original Star exist, forcing me to rely on screenshots, manuals, and contemporary articles about the system. (This has changed since this article was written; see Ian Crossfield’s comment below.) Nevertheless, those sources are enough to identify an influence of the Star upon the Macintosh that’s every bit as clear-cut as that of the Macintosh upon Microsoft Windows. It strains the bounds of credibility to believe that the Mac team coincidentally developed a skeuomorphic interface using many of the very same metaphors — including the central metaphor of the desktop — without taking the example of the Star to heart. To this template they added much innovation, including such modern GUI staples as free-dragging and overlapping windows, drop-down menus, and draggable icons, along with staple mouse gestures like the hold-and-drag and the double-click. Nonetheless, the foundations of the Mac can be seen in the Star much more obviously than they can in Smalltalk. Crudely put, Apple copied the Star while adding a whole lot of original ideas to the mix, and then Microsoft copied Apple, adding somewhat fewer ideas of their own. The people rejoicing over the Xerox lawsuit, in other words, had this aspect of the story basically correct, even if they did have a tendency to confuse Smalltalk and the Star and misunderstand which of them Xerox was actually suing over.

MacOS started with the skeuomorphic desktop model of the Xerox Star and added to it such fundamental modern GUI concepts as pull-down menus, hold-and-drag, the double-click, and free-dragging, overlapping windows that update themselves even when partially occluded by others.

Of course, the Xerox lawsuit against Apple was legally suspect for all the same reasons as the Apple lawsuit against Microsoft. If anything, there were even more reasons to question the good faith of Xerox’s lawsuit than Apple’s. The source of Xerox’s sudden litigiousness was none other than Bill Lowe, the former IBM executive whose disastrous PS/2 brainchild had already made his attitude toward intellectual property all too clear. Lowe had made a soft landing at Xerox after leaving IBM, and was now telling the press about the “aggressive stand on copyright and patent issues” his new company would be taking from now on. It certainly sounded like he intended to weaponize the long string of innovations credited to Xerox PARC and the Star — using these ideas not to develop products, but to sue others who dared to do so. Lowe’s hoped-for endgame was weirdly similar to his misbegotten hopes for the PS/2’s Micro Channel Architecture: Xerox would eventually license the right to make GUIs and other products to companies like Apple and Microsoft, profiting off their innovations of the past without having to do much of anything in the here and now. This understandably struck many of the would-be licensees as a less than ideal outcome. That, at least, was something on which Apple, Microsoft, and just about everyone else in the computer industry could agree.

Apple’s legal team was left in one heck of an awkward fix. They would seemingly have to argue against Xerox’s broad interpretation of visual copyright while arguing for that same broad interpretation in their own lawsuit against Microsoft — and all in the same court in front of the same judge. Any victory against Xerox could lead to their own words being used against them to precipitate a loss against Microsoft, and vice versa.

It was therefore extremely fortunate for Apple that Judge Vaughn R. Walker struck down Xerox’s lawsuit almost before it had gotten started. At the time of their court filing, Xerox was already outside the statute of limitations for a copyright-infringement claim of the type that Apple had filed against Microsoft. They had thus been forced to make a claim of “unfair competition” instead — a claim which carried with it a much higher evidentiary standard. On March 24, 1990, Judge Walker tossed the Xerox lawsuit, saying it didn’t meet this standard and making the unhelpful observation to Xerox that it would have made a lot more sense as a copyright claim. Apple had dodged a bullet, and Bill Lowe would have to find some other way to make money for his new company.

With the Xerox sideshow thus dispensed with, Apple’s lawyers could turn their attention back to the main event, their case against Microsoft. The same Judge Walker who had decided in their favor against Xerox had taken over from Judge William Schwarzer in the other case as well. No longer needing to worry about protecting their flank from Xerox, Apple’s lawyers pushed for what they called “total concept” or “gestalt” look and feel as the metric for deciding whether Windows infringed upon MacOS. But on March 6, 1991, Judge Walker agreed with Microsoft’s contention that the case should be decided on a “function by function” basis instead. Microsoft began assembling reels of video demonstrating what they claimed to be pre-Macintosh examples of each one of the ten interface elements that were at issue in the case.

So, even as Windows 3.0 was conquering the world outside the courtroom, both sides remained entrenched in their positions inside it, and the case, already three years old, ground on and on through motion after counter-motion. “We’re going to trial,” insisted Edward B. Stead, Apple’s general counsel, but it wasn’t at all clear when that trial would take place. Part of the problem was the sheer pace of external events. As Windows 3.0 became the fastest-selling piece of commercial software the world had ever seen, the scale and scope of Apple’s grievances just kept growing to match. From the beginning, a key component of Microsoft’s strategy had been to gum up the works in court while Windows 3.0 became a fait accompli, the new standard in personal computing, too big for any court to dare attack. That strategy seemed to be working beautifully. Meanwhile Apple’s motions grew increasingly far-fetched, beginning to take on a distinct taint of desperation.

In May of 1991, for example, Apple’s lawyers surprised everyone with a new charge. Still looking for a way to expand the case beyond those aspects of Windows 2 and 3 which hadn’t existed in Windows 1, they now claimed that the 1985 agreement which had been so constantly troublesome to them in that respect was invalid. Microsoft had allegedly defrauded Apple by saying they wouldn’t make future versions of Windows any more similar to the Macintosh than the first was, and then going against their word. This new charge was a hopeful exercise at best, especially given that the agreement Apple claimed Microsoft had broken had been, if it ever existed, strictly a verbal one; absolutely no language to this effect was to be found in the text of the 1985 agreement. Microsoft’s lawyers, once they picked their jaws up off the floor, were left fairly spluttering with indignation. Attorney David T. McDonald labeled the argument “desperate” and “preposterous”: “We’re on the five-yard line, the goal is in sight, and Apple now shows up and says, ‘How about lacrosse instead of football?'” Thankfully, Judge Walker found Apple’s argument to be as ludicrous as McDonald did, thus sparing us all any more sports metaphors.

On April 14, 1992 — now more than four years on from Apple’s original court filing, in a computing climate transformed almost beyond recognition by the rise of Windows — Judge Walker ruled against Apple’s remaining contentions in devastating fashion. Much of the 1985 agreement was indeed invalid, he said, but not for the reason Apple had claimed. What Microsoft had licensed in that agreement were largely “generic ideas” that should never be susceptible to copyright protection in the first place. Apple was entitled to protect very specific visual elements of their displays, such as the actual icons they used, but they weren’t entitled to protect the notion of a screen with icons in the abstract, nor even that of icons representing specific real-world objects, such as a disk, a folder, or a trash can. Microsoft or anyone else could, in other words, make a GUI with a trash-can icon if they wished; they just couldn’t transplant Apple’s specific rendering of a trash can into their own work. Applying the notion of visual copyright any more broadly than this “would afford too much protection and yield too little competition,” said the judge. Apple’s slippery notion of look and feel, it appeared, was dead as a basis for copyright. After all the years of struggle and at least $10 million in attorney fees on both sides, Judge Walker ruled that Apple’s case was too weak to even present before a jury. “Through five years, there were many points where the case got continuously refined and focused and narrowed,” said a Microsoft spokesman. “Eventually, there was nothing left.”

Still, one can’t accuse Apple of giving up without a fight. They dragged the case out for almost three more years after this seemingly definitive defeat. When the Ninth Circuit Court of Appeals upheld Judge Walker’s judgment in 1994, Apple tried to take the case all the way to the Supreme Court. That august body announced that they would not hear it on February 21, 1995, thus finally putting an end to the whole tortuous odyssey.

The same press which had been so consumed by the case circa 1988 barely noticed its later developments. The narrative of Microsoft’s utter dominance and Apple’s weakness had become so prevalent by the early 1990s that it had become difficult to imagine any outcome other than a Microsoft victory. Yet the case’s anticlimactic ending obscured how dangerous it had once been, not only for Microsoft but for the software industry as a whole. Whatever one thinks in general of the products and business practices of the opposing sides, a victory for Apple would have been a terrible result for the personal-computer industry. The court got this one right in striking all of Apple’s claims down so thoroughly — something that can’t always be said about collisions between technology and the law. Bill Gates could walk away knowing the long struggle had struck an important blow for an ongoing culture of innovation in the software industry. Indeed, like the victory of his hero Henry Ford over a group of automotive patent trolls eighty years before, his victory would benefit his whole industry along with his company — which isn’t to say, of course, that he would have fought the war purely for the sake of altruism.

John Sculley, for his part, was gone from Apple well before the misguided lawsuit he had fostered came to its final conclusion. He was ousted by his board of directors in 1993, after it became clear that Apple would post a loss of close to $200 million for the year. Yet his departure brought no relief to the problems of dwindling market share, dwindling focus, and, most worrisome of all, a dwindling sense of identity. Apple languished, embittered about the ideas Microsoft had “stolen” from them, while Windows conquered the world. One could certainly argue that they deserved a better fate on the basis of a Macintosh GUI that still felt far slicker and more intuitive than Microsoft’s, but the reality was that their own poor decisions, just as much as Microsoft’s ruthlessness, had led them to this sorry place. The mid-1990s saw them mired in the greatest crisis of confidence of their history, licensing the precious Macintosh technology to clone makers and seriously considering breaking themselves up into two companies to appease their angriest shareholder contingents. For several years to come, there would be a real question of whether any part of the company would survive to see the new millennium. Gone were the Jobsian dreams of changing the world through better computing; Apple was reduced to living on Microsoft’s scraps. Microsoft had won in the marketplace as thoroughly as they had in court.

But the full story of Apple’s 1990s travails is one to take up at another time. Now, we should turn to IBM, to see how they coped after the MS-DOS-based Windows, rather than the OS/2-based Presentation Manager, made the world safe for the GUI.

Throughout 1990, that year of wall-to-wall hype over Windows 3.0, Microsoft persisted in dampening expectations for OS/2 in a way that struck IBM as deliberate. The agreement that MS-DOS and Windows were for low-end computers, OS/2 and the Presentation Manager for high-end ones, seemed to have been forgotten by Microsoft as soon as Bill Gates and Steve Ballmer left the Fall 1989 Comdex at which it had been announced. Gates now said that it could take OS/2 another three or four years to inherit the throne from MS-DOS, and by that time it would probably be running Windows rather than Presentation Manager anyway. Ballmer said that OS/2 was really meant to compete with high-end client/server operating systems like Unix, not with desktop operating systems like MS-DOS. They both said that “there will be a DOS 5, 6, and 7, and a Windows 4 and 5.” Meanwhile IBM was predictably incensed by Windows 3.0’s use of protected mode and the associated shattering of the 640 K barrier; that sort of thing was supposed to have been the purview of the more advanced OS/2.

Back in late 1988, Microsoft had hired a system-software architect from DEC named David Cutler to oversee the development of OS/2 2.0. No shrinking violet, he promptly threw out virtually all of the existing OS/2 code, which he pronounced a bloated mess, and started over from scratch on an operating system that would fulfill Microsoft’s original vision for OS/2, being targeted at machines with an 80386 or better processor. The scope and ambition of this project, along with the fact that Microsoft wished to keep it entirely in-house, had turned into yet one more source of tension between the two companies; it could be years still before Cutler’s OS/2 2.0 was ready. There remained little semblance of any coordinated strategy between the two companies, in public or in private.

And yet, in September of 1990, IBM and Microsoft announced a new roadmap for OS/2’s future. The two companies together would finish up one more version of the first-generation OS/2 — OS/2 1.3, which was scheduled to ship the following month — and that would be the end of that lineage. Then IBM would develop an OS/2 2.0 alone — a project they hoped to have done in a year or so — while Cutler’s team at Microsoft continued with the complete rewrite that was now to be marketed as OS/2 3.0.

The announcement, whose substance amounted to a tacit acknowledgement that the two companies simply couldn’t work together anymore on the same project, caused heated commentary in the press. It seemed a convoluted way to evolve an operating system at best, and it was happening at the same time that Microsoft seemed to be charging ahead — and with massive commercial success at that — on MS-DOS and Windows as the long-term face of personal computing in the 1990s. InfoWorld wrote of a “deepening rift” between Microsoft and IBM, characterizing the latest agreement as IBM “seizing control of OS/2’s future.” “Although in effect IBM and Microsoft will say they won’t divorce ‘for the sake of the children,'” said an inside source to the magazine, “in fact they are already separated, and seeking new relationships.” Microsoft pushed back against the “divorce” meme only in the most tepid fashion. “You may not understand our marriage,” said Steve Ballmer, “but we’re not getting divorced.” (One might note that when a couple have to start telling friends that they aren’t getting a divorce, it usually isn’t a good sign about the state of their relationship…)

Charles Petzold, writing in PC Magazine, summed up the situation created by all the mixed messaging: “The key words in operating systems are confusion, uncertainty, anxiety, and doubt. Unfortunately, the two guiding lights of this industry — IBM and Microsoft — are part of the problem rather than part of the solution.” If anything, this view of IBM as an ongoing “guiding light” was rather charitable. OS/2 was drowning in the Windows hype. “The success of Windows 3.0 has already caused OS/2 acceptance to go from dismal to cataclysmic,” wrote InfoWorld. “Analysts have now pushed back their estimates of when OS/2 will gain broad popularity to late this decade, with some predicting that the so-called next-generation operating system is all but dead.”

The final divorce of Microsoft from IBM came soon after to give the lie to all of the denials. In July of 1991, Microsoft announced that the erstwhile OS/2 3.0 was to become its own operating system, separate from both OS/2 and MS-DOS, called Windows NT. With this news, which barely made an impression in the press — it took up less than one quarter of page 87 of that week’s InfoWorld — a decade of cooperation came to an end. From now on, Microsoft and IBM would exist strictly as competitors in a marketplace where Microsoft enjoyed all the advantages. In the final divorce settlement, IBM gave up all rights to the upcoming Windows NT and agreed to pay a small royalty on all future sales of OS/2 (whatever those might amount to), while Microsoft paid a lump sum of around $30 million to be free and clear of their last obligations to the computing giant that had made them what they now were. They greeted this watershed moment with no sentimentality whatever. In a memo that leaked to the press, Bill Gates instead rejoiced that Microsoft was finally free of IBM’s “poor code, poor design, and other overhead.”

Even as the unlikely partnership’s decade of dominance was passing away, Microsoft’s decade of sole dominion was just beginning. The IBM PC and its clones had become the Wintel standard, and would require no further input from Big Blue, thank you very much. IBM’s share of the standard’s sales was already down to 17 percent, and would just keep on falling from there. “Microsoft is now driving the industry, not IBM,” wrote the newsletter Software Publishing by way of stating the obvious.

Which isn’t to say that IBM was going away. While Microsoft was celebrating their emancipation, IBM continued plodding forward with OS/2 2.0, which, like the aborted version 3.0 that was now to be known as Windows NT, ran only on an 80386 or better. They made a big deal of the work-in-progress at the Fall 1991 Comdex without managing to change the narrative around it one bit. The total bill for OS/2 was approaching an astonishing $1 billion, and they had very little to show for it. One Wall Street analyst pronounced OS/2 “the greatest disaster in IBM’s history. The reverberations will be felt throughout the decade.”

At the end of that year, IBM had to report — incredibly, for the very first time in their history — an annual loss. And it was no trivial loss either. The deficit was $2.8 billion, on revenues that had fallen 6.1 percent from the year before. The following year would be even worse, to the tune of a $5 billion loss. No company in the history of the world had ever lost this much money this quickly; by the last quarter of 1993, IBM would be losing $45 million every day. Microcomputers were continuing to replace the big mainframes and minicomputers that had once been the heart of IBM’s business. Now, though, fewer and fewer of those replacement machines were IBM personal computers; whole segments of their business were simply evaporating. The vague distrust IBM had evinced toward Microsoft for most of the 1980s now seemed amply justified, as all of their worst nightmares came true. IBM seemed old, bloated, and, worst of all, irrelevant next to the fresh-faced young Microsoft.

OS/2 2.0 started reaching consumers in May of 1992. It was a surprisingly impressive piece of work; perhaps the relationship with Microsoft had been as frustrating for IBM’s programmers as it had been for their counterparts at Microsoft. Certainly OS/2 2.0 was a far more sophisticated environment than Windows 3.0. Being designed to run only on 32-bit microprocessors like the 80386 and 80486, it utilized them to their maximum potential, which was much more than one could say for Windows, while also being much more stable than Microsoft’s notoriously crash-prone environment. In addition to native OS/2 software, it could run multiple MS-DOS applications at the same time with complete compatibility (a trick enabled by the 80386’s virtual-8086 mode), and, in a new wrinkle added to the mix by IBM, could now run many Windows applications as well. IBM called it “a better DOS than DOS and a better Windows than Windows,” a claim which carried a considerable degree of truth. They pointedly cut its suggested list price of $140 to just $50 for Windows users looking to “upgrade.”

A Quick Tour of OS/2 2.0


Shipping on more than twenty 3.5-inch diskettes, OS/2 2.0 was by far the most elaborate operating system yet made for its family of personal computers. When we boot it up for the first time, we’re given a lengthy interactive tutorial of a sort that was seldom seen in software of 1992 vintage.

The notion of a “Presentation Manager” GUI that’s separate from the core OS/2 operating system has been dropped; OS/2 is now simply OS/2, with a GUI as the standard, built-in interface. From the opening tutorial to the look of its desktop, the whole package reminds one of nothing so much as the much later Windows 95. We have a full-fledged, functioning desktop workspace here, with icons representing folders and disks, and a “shredder” to replace the usual trash can.

After shipping earlier versions of OS/2 with no extra tools or applets whatsoever, IBM got wise this time around and included plenty of stuff to play with, like this neat little music editor.

Some aspects of the interface are a little strange. Dragging with the mouse is accomplished using the right button rather than the left — a fine example of OS/2’s superficial similarity and granular dissimilarity to Windows, which so many users who had to move back and forth between the environments found so frustrating.

Of course, MS-DOS is still around if you need it. Unlike in OS/2 1.x, here you can have as many MS-DOS windows and applications open as you like.

But, despite its many merits, OS/2 2.0 was a lost cause from the start, at least if one’s standard for success was Windows. Windows 3.1 rolled out of Microsoft at almost the same instant, and no amount of comparisons in techie magazines pointing out the alternative operating system’s superiority could have any impact on a mass market that was now thoroughly conditioned to accept Windows as the standard. Giant IBM’s operating system had become, as the New York Times put it, “an unlikely underdog.”

In truth, the contest was so lopsided by this point as to be laughable. Microsoft, who had long-established relationships with the erstwhile clone makers — now known as makers of hardware conforming to the Wintel standard — understood early, as IBM did only much too late, that the best and perhaps only way to get your system software widely accepted was to sell it pre-installed on the computers that ran it. Thus, by the time OS/2 2.0 shipped, Windows already came pre-installed on nine out of ten personal computers on the market, thanks to a smart and well-funded “original equipment manufacturer” sales team that was overseen personally by Steve Ballmer. And thus, simply by buying a new computer, one automatically became a Windows user. Running OS/2, on the other hand, required that the purchaser of one of these machines decide to go out and buy an alternative to the perfectly good Microsoft software already on her hard drive, and then go through all the trouble of installing and configuring it. Very few people had the requisite combination of motivation and technical skill for an exercise like that.

As a final indignity, IBM themselves had to bow to customer demand and offer MS-DOS and Windows as an optional alternative to OS/2 on their own machines. People wanted the system software that they used at the office, that their friends had, that could run all of the products on the shelves of their local computer store with 100-percent fidelity (with the exception of that oddball Mac stuff off in the corner, of course). Only the gearheads were going to buy OS/2 because it was a 32-bit instead of a 16-bit operating system or because it offered preemptive instead of cooperative multitasking, and they were a tiny slice of an exploding mass market in personal computing.
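For those wondering about that last distinction: under cooperative multitasking, the model used by Windows 3.x (and MacOS) at the time, a program keeps the processor until it voluntarily gives it up, so a single misbehaving program can freeze the entire machine; under preemptive multitasking, the model used by OS/2 2.0, the operating system interrupts programs on its own authority. Here is a minimal sketch in Python, offered purely as illustration and bearing no resemblance to either system’s actual code:

# Cooperative multitasking: each task runs until it voluntarily
# yields. A task that never yields starves all the others -- the
# failure mode behind so many frozen Windows 3.x machines.
def counting_task(name):
    for i in range(3):
        print(f"{name}: step {i}")
        yield  # the explicit, voluntary handoff of the processor

def run_cooperatively(tasks):
    tasks = list(tasks)
    while tasks:
        for task in tasks[:]:
            try:
                next(task)  # let the task run until its next yield
            except StopIteration:
                tasks.remove(task)

run_cooperatively([counting_task("A"), counting_task("B")])

# Preemptive multitasking: a scheduler interrupts tasks on its own
# authority, so no explicit yield is needed anywhere. Threads, which
# the runtime preempts automatically, are the nearest everyday analogue.
import threading

def oblivious_task(name):
    for i in range(3):
        print(f"{name}: step {i}")  # no yield in sight

threads = [threading.Thread(target=oblivious_task, args=(n,)) for n in ("C", "D")]
for t in threads:
    t.start()
for t in threads:
    t.join()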

That said, OS/2 did have a better fate than many another alternative operating system during this period of Windows, Windows everywhere. It stayed around for years even in the face of that juggernaut, going through two more major revisions and many minor ones, the very last coming as late as December of 2001. It remained always a well-respected operating system that just couldn’t break through Microsoft’s choke hold on mainstream computing, having to content itself with certain niches — powering automatic teller machines was a big one for a long time — where its stability and robustness served it well.

So, IBM, and Apple as well, had indeed become the outsiders of personal computing. They would retain that dubious status for the balance of the decade of the 1990s, offering alternatives to the monoculture of Windows computing that appealed only to the tech-obsessed, the idealistic, or the just plain contrarian. Even as much of what I’ve related in this article was taking place, they were being forced into one another’s arms for the sake of sheer survival. But the story of that second unlikely IBM partnership — an awkward marriage of two corporate cultures even more dissimilar than those of Microsoft and IBM — must, like so much else, be told at another time. All that’s left to tell in this series is the story of how Windows, with the last of its great rivals bested, finished the job of conquering the world.

(Sources: the books The Making of Microsoft: How Bill Gates and His Team Created the World’s Most Successful Software Company by Daniel Ichbiah and Susan L. Knepper, Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America by Stephen Manes and Paul Andrews, Computer Wars: The Fall of IBM and the Future of Global Technology by Charles H. Ferguson and Charles R. Morris, and Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer; PC Week of September 24 1990 and January 15 1991; InfoWorld of September 17 1990, May 29 1991, July 29 1991, October 28 1991, and September 6 1993; New York Times of December 29 1989, March 24 1990, March 7 1991, May 24 1991, January 18 1992, August 8 1992, January 20 1993, April 19 1993, and June 2 1993; Seattle Times of June 2 1993. Finally, I owe a lot to Nathan Lineback for the histories, insights, comparisons, and images found at his wonderful online “GUI Gallery.”)

Footnotes

1 This has changed since this article was written; see Ian Crossfield’s comment below.


Doing Windows, Part 6: Look and Feel

From left, Dan Fylstra of VisiCorp, Bill Gates of Microsoft, and Gary Kildall of Digital Research in 1984. As usual, Gates looks rumpled, high-strung, and vaguely tortured, while Kildall looks polished, relaxed, and self-assured. (Which of these men would you rather chat with at a party?) Pictures like these perhaps reveal one of the key reasons that Gates consistently won against more naturally charismatic characters like Kildall: he personally needed to win in ways that they did not.

In the interest of clarity and concision, I’ve restricted this series of articles about non-Apple GUI environments to the efforts of Microsoft and IBM, making an exception to that rule only for VisiCorp’s Visi On, the very first product of its type. But, as I have managed to acknowledge in passing, those GUIs hardly constituted the sum total of the computer industry’s efforts in this direction. Among the more impressive and prominent of what we might label the alternative MS-DOS GUIs was a product from none other than Gary Kildall and Digital Research — yes, the very folks whom Bill Gates once so slyly fleeced out of a contract to provide the operating system for the first IBM PC.

To his immense credit, Kildall didn’t let the loss of that once-in-a-lifetime opportunity get him down for very long. Digital Research accepted the new MS-DOS-dominated order with remarkable alacrity, and set about making the best of things by publishing system software, such as the multitasking Concurrent DOS, which tried to do the delicate dance of improving on MS-DOS while maintaining compatibility. In the same spirit, they made a GUI of their own, called the “Graphics Environment Manager” — GEM.

After futzing around with various approaches, the GEM team found their muse on the day in early 1984 when team-member Darrell Miller took Apple’s new Macintosh home to show his wife: “Her eyes got big and round, and she hates computers. If the Macintosh gets that kind of reaction out of her, this is powerful.” Miller is blunt about what happened next: “We copied it exactly.” When they brought their MacOS clone to the Fall 1984 Comdex, Steve Jobs expressed nothing but approbation. “You did a great job!” he said. No one from Apple seemed the slightest bit concerned at this stage about the resemblance to the Macintosh, and GEM hit store shelves the following spring as by far the most elegant and usable MS-DOS GUI yet.

A few months later, though, Apple started singing a very different tune. In the summer of 1985, they sent a legal threat to Digital Research which included a detailed list of all the ways that they believed GEM infringed on their MacOS copyrights. Having neither the stomach nor the cash for an extended court battle and fearing a preliminary injunction which might force them to withdraw GEM from the market entirely, Digital Research caved without a fight. They signed an agreement to replace the current version of GEM with a new one by November 15, doing away with such distinctive and allegedly copyright-protected Macintosh attributes as “the trash-can icon, the disk icons, and the close-window button in the upper-left-hand corner of a window.” They also agreed to an “undisclosed monetary settlement,” and to “provide programming services to Apple at a reduced rate.”

Any chance GEM might have had to break through the crowded field of MS-DOS GUIs was undone by these events. Most of the third-party developers Digital Research so desperately needed were unnerved by the episode, abandoning any plans they might have hatched to make native GEM applications. And so GEM, despite being vastly more usable than the contemporaneous Microsoft Windows even in its somewhat bowdlerized post-agreement form, would go on to become just another also-ran in the GUI race. (Reworked to run under a 68000 architecture, GEM would enjoy some degree of sustained success in another realm: not as an MS-DOS-hosted GUI but as the GUI hosted in the Atari ST’s ROM. In this form, it would survive well into the 1990s.)

For the industry at large, the GEM smackdown was most significant as a sign of changing power structures inside Apple — changes which carried with them a new determination that others shouldn’t be allowed to rip off all of the Mac’s innovations. The former Pepsi marketing manager John Sculley was in the ascendant at Apple by the summer of 1985, Steve Jobs already being eased out the door. Sculley had been taught by the Cola Wars that a product’s secret formula was everything, and had to be protected at all costs. And the Macintosh’s secret formula was its beautiful interface; without it, it was just an overpriced chunk of workmanlike hardware — a bad joke when set next to a better, cheaper Motorola 68000-based computer like the new Commodore Amiga. The complaint against Digital Research was a warning shot to an industry that Sculley believed had gotten far too casual about throwing around phrases like “Mac-like.” “Apple is going after everybody,” warned one fearful software executive to the press. The relationship between Microsoft and Apple in particular was about to get a whole lot more complicated.

Said relationship had been a generally good one during the years when Steve Jobs was calling many of Apple’s shots. Jobs and Bill Gates, dramatically divergent in countless ways but equally ambitious, shared a certain esprit de corps born of having been a part of the microcomputer industry since before there was a microcomputer industry. Jobs genuinely appreciated his counterpart’s refusal to frame business computing as a zero-sum game between the Macintosh and the MS-DOS standard, even when provoked by agitprop like Apple’s famous “1984” Super Bowl advertisement. Instead Gates, contrary to his established popular reputation as the ultimate zero-sum business warrior, supported Apple’s efforts as well as IBM’s with real enthusiasm: signing up to produce Macintosh software two full years before the finished Mac was released, standing at Jobs’s side when Apple made major announcements, coming to trade shows conspicuously sporting a Macintosh tee-shirt. All indications are that the two truly liked and respected one another. For all that Apple and Microsoft through much of these two men’s long careers would be cast as the yin and yang of personal computing — two religions engaged in the most righteous of holy wars — they would have surprisingly few negative words to say about one another personally down through the years.

But when Steve Jobs decided or was forced to submit his resignation letter to Apple on September 17, 1985, trouble for Microsoft was bound to follow. John Sculley, the man now charged with cleaning up the mess Jobs had supposedly made of the Macintosh, enjoyed nothing like the same camaraderie with Bill Gates. He and his management team were openly suspicious of Microsoft, whose Windows was already circulating widely in beta form. Gates and others at Microsoft had gone on the record repeatedly saying they intended for Windows and the Macintosh to be sufficiently similar that they and other software developers would be able to port applications in short order between the two. Few prospects could have sounded less appealing to Sculley. Apple, whose products then as now enjoyed the highest profit margins in the industry thanks to their allure as computing’s hippest luxury brand, could see their whole business model undone by the appearance of cheap commodity clones that had been transformed by the addition of Windows into Mac-alikes. Of course, one look at Windows as it actually existed in 1985 could have disabused Sculley of the notion that it was likely to win any converts among people who had so much as glanced at MacOS. Still, he wasn’t happy about the idea of the Macintosh losing its status, now or in the future, as the only GUI environment that could serve as a true, comprehensive solution to all of one’s computing needs. So, within weeks of Jobs’s departure, feeling his oats after having so thoroughly cowed Digital Research, he threatened to sue Microsoft as well for copying the “look and feel” of the Macintosh in Windows.

He really ought to have thought things through a bit more before doing so. Threatening Bill Gates was always a dangerous game to play, and it was sheer folly when Gates had the upper hand, as he largely did now. Apple was at their lowest ebb of the 1980s when they tried to tell Microsoft that Windows would have to be cancelled or radically redesigned to excise any and all similarities to the Macintosh. Sales of the Mac had fallen to some 20,000 units per month, about one-fifth of Apple’s pre-launch projections for this point. The stream of early adopters with sufficient disposable income to afford the pricey gadget had ebbed away, and other potential buyers had started asking what you could really do with a Macintosh that justified paying two or three times as much for it as for an equivalent MS-DOS-based computer. Aldus PageMaker, the first desktop-publishing package for the Mac, had been released the previous summer, and would eventually go down in history as the product that, when combined with the Apple LaserWriter printer, saved the platform by providing a usage scenario that ugly old MS-DOS clearly, obviously couldn’t duplicate. But the desktop-publishing revolution would take time to show its full import. In the meantime, Apple was hard-pressed, and needed Microsoft — one of the few major publishers of business software actively supporting the Mac — far too badly to go around issuing threats to them.

Gates responded to Sculley’s threat with several of his own. If Sculley followed through with a lawsuit, Gates said, he’d stop all work at Microsoft on applications for the Macintosh and withdraw those that were already on store shelves, treating business computing henceforward as exactly the zero-sum game which he had never believed it to be in the past. This was a particularly potent threat in light of Microsoft’s new Excel spreadsheet, which had just been released to rave reviews and already looked likely to join PageMaker as the leading light among the second generation of Mac applications. Given the Mac’s marketplace travails, Apple was in no position to toss aside a sales driver like that one: the first piece of everyday Mac business software that was not just as good as but in many ways quite clearly better than equivalent offerings for MS-DOS. Yet Gates wouldn’t stop there. He would also, he said, refuse to renew Apple’s license to use Microsoft’s BASIC on their Apple II line of computers. This was a serious threat indeed, given that the aged Apple II line was the only thing keeping Apple as a whole afloat as the newer, sexier Macintosh foundered. Duly chastised, Apple backed down quickly — whereupon Gates, smelling blood in the water, pressed his advantage relentlessly, determined to see what else he could get out of finishing the fight Sculley had so foolishly begun.

One ongoing source of frustration between the two companies, dating back well into the days of Steve Jobs’s power and glory, was the version of BASIC for the Mac which Microsoft had made available for purchase on the day the machine first shipped. In the eyes of Apple and most of their customers, the mere fact of its existence on a platform that wasn’t replete with accessible programming environments was its only virtue. In practice, it didn’t work all that differently from Microsoft’s Apple II BASIC, offering almost no access to the very things which made the Macintosh the Macintosh, like menus, windows, and dialogs. A second release a year later had improved matters somewhat, but nowhere near enough in most people’s view. So, Apple had started work on a BASIC of their own, to be called simply MacBASIC, to supersede Microsoft’s. Microsoft BASIC for the Macintosh was hardly a major pillar of Microsoft’s finances, but Bill Gates was nevertheless bothered inordinately by the prospect of it being cast aside. “Essentially, since Microsoft started their company with BASIC, they felt proprietary towards it,” speculates Andy Hertzfeld, one of the most important of the Macintosh software engineers. “They felt threatened by Apple’s BASIC, which was a considerably better implementation than theirs.” Gates said that Apple would have to kill their own version of BASIC and — just to add salt to the wound — sign over the name “MacBASIC” to Microsoft if they wished to retain the latter’s services as a Mac application developer and retain Microsoft BASIC on the Apple II.

And that wasn’t even the worst form taken by Gates’s escalation. Apple would also have to sign what amounted to a surrender document, granting Microsoft the right to create “derivative works of the visual displays generated by Apple’s Lisa and Macintosh graphic-user-interface programs.” The specific “derivative works” covered by the agreement were the user interfaces already found in Microsoft Windows for MS-DOS and five Microsoft applications for the Macintosh, including Word and Excel. The agreement provided Microsoft with nothing less than a “non-exclusive, worldwide, royalty-free, perpetual, non-transferable license to use those derivative works in present and future software programs, and to license them to and through third parties for use in their software programs.” In return, Microsoft would promise only to support Word and Excel on the Mac until October 1, 1986 — something they would certainly have done anyway. Gates was making another of those deviously brilliant tactical moves that were already establishing his reputation as the computer industry’s most infamous villain. Rather than denying that a “visual display” could fall under the domain of copyright, as many might have been tempted to do, he would rather affirm the possibility while getting Apple to grant Microsoft an explicit exception to being bound by it. Thus Apple — or, for that matter, Microsoft — could continue to sue MacOS’s — and potentially Windows’s — competitors out of existence while Windows trundled on unmolested.

Sculley called together his management team to discuss what to do about this Apple threat against Microsoft that had suddenly boomeranged into a Microsoft threat against Apple. Most at the meeting insisted that Gates had to be bluffing, that he would never cut off several extant revenue streams just to spite Apple and support this long-overdue Windows product of his which had been an industry laughingstock for so long. But Sculley wasn’t sure; he kept coming back to the fact that Microsoft could undoubtedly survive without Apple, but Apple might not be able to survive without Microsoft — at least not right now, given the Mac’s current travails. “I’m not ready to bloody the company,” he said, and signed the surrender document two days after Windows 1.01 first appeared in its boxed form at the Fall 1985 Comdex show’s Microsoft Roast. His tone toward Gates now verged on pleading: “What I’m really asking for, Bill, is a good relationship. I’m glad to give you the rights to this stuff.”

After the full scale of what John Sculley had given away to Bill Gates became clear, Apple fans started drawing pointed comparisons between Sculley and Neville Chamberlain. As it happened, Sculley’s version of “peace for our time” would last scarcely longer than Chamberlain’s. And as for Gates… well, plenty of Apple fans would indeed soon be calling him the Adolf Hitler of the computer industry in the midst of plenty of other overheated rhetoric.

Bill Gates wrote a jubilant email to eleven colleagues at Microsoft’s partner companies, saying that he had “received a release from Apple for any possible copyright, trade-secret, or patent issue relating to our products, including Windows.” The people at Apple were less jubilant. “Everyone was somewhat disgusted over [the agreement],” remembers Donn Denman, the chief programmer of Apple’s much superior but shitcanned MacBASIC. Sculley could only say to him and his colleagues that “it was the right decision for the company. It was a business decision.” They didn’t find him very convincing. The bad feelings engendered by the agreement would never entirely go away, and the relationship between Apple and Microsoft would never be quite the same again — not even when Excel became one of the prime drivers of something of a Macintosh Renaissance in the following year.

We jump forward now to March 17, 1988, by which time the industry had changed considerably. Microsoft was still entangled with IBM in the development of OS/2 and its Presentation Manager, but was also continuing to push Windows, which had come out in a substantially revised version 2 some six months earlier. The Macintosh, meanwhile, had carved out a reasonable niche for itself as a tool for publishers and creative professionals of various stripes, even as the larger world of business-focused personal computing continued to run on MS-DOS.

Sitting in his office that day, Bill Gates agreed to take a call from a prominent technology journalist, who asked him if he had a comment to make about the new lawsuit from Apple against Microsoft. “Lawsuit? What lawsuit?” Gates asked. He had just met with Sculley the day before to discuss Microsoft’s latest Mac applications. “He never mentioned it to me. Not one word,” said Gates to the reporter on the other end of the line.

Sculley, it seemed, had decided not to risk losing his nerve again. Apple had gone straight to filing their lawsuit in court, without giving Microsoft so much as a warning, much less a chance to negotiate a remedy. (Apple sued Hewlett-Packard at the same time, for an application called NewWave which ran on top of Windows and provided many of the Mac-like features, such as icons representing programs and disks and a desktop workspace, which Windows 2 alone still lacked. But that lawsuit would always remain a sideshow in comparison to the main event to whose fate its own must inevitably be tied. So, in the interest of that aforementioned clarity and concision, we won’t concern ourselves with it here.) It appeared that the latest version of Microsoft’s GUI environment for MS-DOS, which with its free-dragging and overlapping windows hewed much closer to the Macintosh than its predecessor, had both scared and enraged Sculley to such an extent that he had judged this declaration of war to be his only option. “Windows 2 is an unconscionable ripoff of MacOS,” claimed Apple. They demanded $50,000 per infringement per unit of Windows sold — adding up to a downright laughable total of $4.5 billion by their current best estimate — and the “impoundment and destruction” of all extant or future copies of Windows. Microsoft replied that Apple had signed over the rights to the Mac’s “visual displays” for use in Windows in 1985, and, even if they hadn’t, such things weren’t really copyrightable anyway.

So, who had the right of this thing? As one might expect, the answer to that question is far more nuanced than the arguments which either side would present in court. Writing years after the lawsuit had passed into history but repeating the arguments he had once made in court, Tandy Trower, the Windows project leader at Microsoft from 1985 to 1988, stated that “the allegation clearly had no merit as I had never intended to copy the Macintosh interface, was never given any directive to do that, and never directed my team to do that. The similarities between the two products were largely due to the fact that both Windows and Macintosh had common ancestors, that being many of the earlier windowing systems, such as those like Alto and Star that were created at Xerox PARC.” This is, to put it bluntly, nonsense. To deny the massive influence of the Macintosh on Windows is well-nigh absurd — although, I should be careful to say, I have no reason to believe that Trower makes his absurd argument out of anything but ignorance here. By the time he arrived on the Windows team, Apple’s implementation of the GUI had already been so thoroughly internalized by the industry in general that the huge strides it had made over the Xerox PARC model were being forgotten, and the profoundly incorrect and unfair meme that Apple had simply copied Xerox’s work and called it a day was already taking hold.

The people at Xerox PARC had indeed originated the idea of the GUI, but, as playing with a Xerox Alto emulator will quickly reveal, hadn’t been able to take it anywhere close to the Macintosh’s place of elegant, intuitive usability. By the time the Xerox GUI made its one appearance as a commercial product, in the form of the Xerox Star office system, it had actually regressed in at least one way even as it progressed in many others: overlapping windows, which had been possible in Xerox PARC’s Smalltalk environment, were not allowed on the Star. Tellingly, the aspect of Windows 1 which attracted the most derision back in the day, and which still makes it appear so hapless today, is a similar rigid system of tiled windows. (The presence of this retrograde-seeming element was largely thanks to Scott MacGregor, who arrived at Microsoft to guide the Windows project after having been one of the major architects of the Star.) Meanwhile, as I also noted in my little tour of Windows 1 in a previous article, many of those aspects of it which do manage to feel natural and intuitive today — such as the drop-down menus — are those that work more like the Macintosh than anything developed at Xerox PARC. In light of this reality, Microsoft’s GUI would only hew closer to the Mac model as time went on, for the incontrovertible reason that the Mac model was just better for getting real stuff done in the real world.

And there are plenty of other disconcerting points of similarity between early versions of MacOS and early versions of Windows. Right from the beginning, Windows 1 shipped with a suite of applets — a calculator, a “control panel” for system settings, a text editor that went by the name of “notepad,” etc. — that were strikingly similar to those included in MacOS. Further, if other members of the Windows team itself are to be believed, Microsoft’s Neil Konzen, a programmer intimately familiar with the Macintosh, duplicated some of MacOS’s internal structures so closely as to introduce some of the same bugs. In short, to believe that the Macintosh wasn’t the most important influence on the Windows user interface by far, given not only the similarities in the finished product but the knowledge that Microsoft had been working daily with the evolving Macintosh since January of 1982, is to actively deny reality out of either ignorance or some ulterior motive.

Which isn’t to say that Microsoft’s designers had no ideas of their own. In fact, some of those ideas are still in place in current versions of Windows. To take perhaps the most immediately obvious example, Windows then and now places its drop-down menus at the top of the windows themselves, while the Macintosh has a menu bar at the top of the screen which changes to reflect the currently selected window. (Both Microsoft and Apple have collected reams of data which they claim prove that their approach is the best one. I do suspect, however, that the original impetus can be found in the fact that MacOS was originally a single-tasking operating system, meaning that only one menu bar would need to be available at any one time; Windows, on the other hand, was designed as a multitasking environment from the start.) And Microsoft’s embrace of the two-button mouse, contrasted with Apple’s stubborn loyalty to the one-button version of same, has sparked constant debate for decades. Still, differing details like these should be seen as exactly that in light of all the larger-scale similarities.

And yet just acknowledging that Windows was, shall we say, strongly influenced by MacOS hardly got to the bottom of the 1988 case. There was still the matter of that November 1985 agreement, which Microsoft was now waving in the face of anyone in the legal or journalistic professions who would look at it. The bone of contention between the two companies here was whether the “visual displays” of Windows 2 as well as Windows 1 were covered by the agreement. Microsoft naturally contended that they were; Apple contended that the Windows 2 interface had changed so much in comparison to its predecessor — coming to resemble the Macintosh even more in the process — that it could no longer be considered one of the specific “derivative works” to which Apple had granted Microsoft a license.

We’ll return to the court’s view of this question shortly. For now, though, let’s give Apple the benefit of the doubt as we continue to explore the full ramifications of their charges against Microsoft. The fact was that if one accepted Apple’s contention that Windows 2 wasn’t covered by the agreement, the questions surrounding the case grew more rather than less momentous. Could and should one be able to copyright the “look and feel” of a user interface, as opposed to the actual code used to create it? In pressing their claim, Apple was relying on an amorphous, under-explicated area of copyright law known as “visual copyright.”

In terms of computer software, the question of the bounds of visual copyright had been most thoroughly explored in the context of videogames. Back in 1980, Midway, a major producer of standup-arcade games, had sued a much smaller company called Dirkschneider for producing a clone of their popular game Galaxian. The judge in that case ruled in favor of Midway, formulating a new legal standard called the “Ten-Foot Rule”: “If a reasonable person could not, at ten feet, tell the difference between two competitive products, then there was cause to believe an infringement was occurring.” Atari, the biggest videogame producer of all, then proceeded to use this precedent to pressure dozens of companies into withdrawing their clones of Atari games — in arcades, on game consoles, and on computers — from the market.

Somewhat later, in 1985, Brøderbund Software sued Kyocera for bundling with their printers an application called Printmaster, a thinly veiled clone of Brøderbund’s own hugely popular Print Shop package for making signs, greeting cards, and banners. They won their case the following year, with Judge William H. Orrick stating that Brøderbund’s copyright did indeed cover “the overall appearance, structure, and sequence” of screens in their software, and that Kyocera had thus infringed on same. Brøderbund’s Gary Carlston called the ruling “historic”: “If we don’t have copyright protection for our products, then it is going to be significantly more difficult to maintain a competitive advantage.” Encouraged by this ruling, in 1987 a maker of telecommunications software called Digital Communications Associates sued a company called Softklone Corporation — their name certainly didn’t help their cause — for copying the status display of their terminal software, and won their case as well. The Ten-Foot Rule, it seemed, could be successfully applied to software other than games. Both of these cases were cited by Apple’s lawyers in their own suit against Microsoft.

Brøderbund’s Print Shop side-by-side with Kyocera’s Printmaster.

Yet the Ten-Foot Rule, at least when applied to general-purpose software rather than games, struck many as deeply problematic. One of the most important advantages of a GUI was the way it made diverse types of software from diverse developers work and look the same, thereby keeping the user from having to relearn how to do the same basic tasks over and over again. What sort of chaos would follow if people started suing each other willy-nilly over this much-needed uniformity? And what did the Ten-Foot Rule mean for the many GUI environments, on MS-DOS and other platforms, that looked so similar to one another and to MacOS? That, of course, was the real crux of the matter for Microsoft and Apple as they faced one another in court.

The debate over the Ten-Foot Rule and its potential ramifications wasn’t actually a new one, having already been taken up in public by the software industry before Apple filed their lawsuit. Fully thirteen months before that momentous day, Larry Tesler, an Apple executive, clashed heatedly with Bill Gates over this very issue at a technology conference. Tesler insisted that there was no problem inherent in applying the Ten-Foot Rule to operating systems and operating environments: “When someone comes up with a very good and popular look and feel, as we’ve done with the Macintosh, then they can make that available by licensing [it] to other people.”

But Gates was having none of this:

There’s no control of look and feel. I don’t know anybody who has asserted that things like drop-down menus and dialog boxes and just those general form-type aspects are subject to this look-and-feel stuff. Certainly it’s our view that the consistency of the user interface has become a very spreading thing, and that it’s open, generic technology. All of these approaches — how you click on [menu] bars, and certainly all those user-interface techniques and windows — there’s absolutely no restriction in any way on how people use those.

He thus ironically argued against the very premise of the 1985 agreement between Apple and Microsoft — that Apple had created a “visual display” subject to copyright protection, to which they were now granting Microsoft a license for certain products. But then, Gates seldom let deeply-held philosophical beliefs interfere with his pursuit of short-term advantage. In this latest debate as well, Gates’s arguments were undoubtedly self-serving, but they were no less valid for being so. The danger he could point to if this sort of thing should spread was that of every innovative new application seeking copyright protection not just for its code but for the very ideas that made it up; after all, form (i.e., look and feel) ideally follows function in software engineering. What would have happened if VisiCorp had been able to copyright the look and feel of the first spreadsheet? (VisiCalc and Lotus 1-2-3 looked pretty much identical from ten feet.) If WordStar had been able to copyright the look and feel of the word processor? If, to choose a truly absurd example, the first individual to devise a command-line interface back in the mists of time had been able to copyright that? It wasn’t at all clear where the lines could be drawn once the law started down this slippery slope. If Apple owned the set of ideas and approaches that everyone now thought of as the GUI in general, where did that leave the rest of the industry?

For this reason, Apple’s lawsuit, when it came, was greeted with deep concern even by many of those who weren’t particularly friendly with Microsoft. “Although Apple has a right to protect the results of its development and marketing efforts,” said the respected Silicon Valley pundit Larry Magid, “it should not try to thwart the obvious direction of the industry.” “If Apple is trying to push this as far as they appear to be trying to push it,” said Dan Bricklin of VisiCalc fame, “this is a sad day for the software industry in America.” More surprisingly, MacOS architect Andy Hertzfeld said that “in general, it’s a horrible thing. Apple could really end up hurting itself.” Most surprisingly of all, even Steve Jobs, now running a new company called NeXT, found Apple’s arguments as dangerous as they were unconvincing: “When we were developing the Macintosh, we kept in mind a famous quote of Picasso: ‘Good artists copy, great artists steal.’ What do I think of the suit? I personally don’t understand it. Can I copyright gravity? No.”

Interestingly, the lawyers pressing the lawsuit on Apple’s behalf didn’t ask for a preliminary injunction that would have forced Microsoft to withdraw Windows from the market. Some legal watchers interpreted this fact as a sign that Apple’s attorneys themselves weren’t certain about the real merits of their case, and hoped to win it as much through bluster as through finely-honed legal arguments. Ditto Apple’s request that the eventual trial be decided by a jury of ordinary people, who might be prone to weigh the case based on everyday standards of “fairness,” rather than by a judge who would be well-versed in the niceties of the law and the full ramifications of a verdict against Microsoft.

At this point, and especially given those ramifications, one feels compelled to ask just why Apple chose at this juncture to embark on such a lengthy, expensive, and fraught enterprise as a lawsuit against the company that remained the most important single provider of serious business software for the Macintosh, a platform whose cup still wasn’t exactly running over with such things. By way of an answer, we should consider that John Sculley was as proud a man as most people who rise to his elevated status in business tend to be. The belief, widespread both inside and outside of Apple, that he had let Bill Gates bully, outsmart, and finally rob him blind back in 1985 had to rankle him badly. In addition, Apple in general had long nursed a grievance, unproductive but understandable, against all the outsiders who had copied the interface they had worked so long and hard to perfect; thus those threatened lawsuits against Digital Research and Microsoft all the way back in 1985. A wiser leader might have told his employees to take their competitors’ imperfect copying as proof of Apple’s superiority, might have exhorted them to look toward their next big innovation rather than litigate their innovations of the past. But, at least on March 17, 1988, John Sculley wasn’t that wiser leader. Thus this lawsuit, dangerous not just to Apple and Microsoft but to their entire industry.

Bill Gates, for his part, remained more accustomed to bullying than being bullied. It had been spelled out for him right there in the court filing that a loss to Apple would almost certainly mean the end of Windows, the operating environment which was quite possibly the key to Microsoft’s future. Even widespread fear of such an outcome, he realized, could be devastating to Windows’s — and thus to Microsoft’s — prospects. So he struck back fiercely, leaving no doubt where he stood. Microsoft filed a counter-suit in April of 1988, accusing Apple of breaking the 1985 agreement and of filing their own lawsuit in bad faith, in the hope of creating fear, uncertainty, and doubt around Windows and thus “wrongfully inhibiting” its commercial future. Lending weight to the argument that the original lawsuit was business competition by other means was Apple’s odd selectivity in choosing whom to sue over the alleged copyright violations. Asked why they weren’t going after other products just as similar to MacOS as Windows, such as IBM’s forthcoming OS/2 Presentation Manager, Apple refused to comment.

The first skirmishes took place in the press rather than a courtroom: Sculley accusing Gates of having tricked him into signing the 1985 agreement, Gates saying a contract was a contract, and what sort of a chief executive let himself be tricked anyway? The exchanges just kept getting uglier from there. The technology journalists, naturally, loved every minute of it, while the software industry was thrown into a tizzy, wondering what this would mean for Windows just as it finally seemed to be gaining some traction. Philippe Kahn, CEO of Borland, described the situation in colorful if non-politically-correct language: it was like “waking up and finding your partner might have AIDS.”

The court case marched forward much more slowly than the tabloid war of words. Gates stated in a sworn deposition that “from a user’s perspective, the visual displays which appear in Windows 2 are virtually identical to those which appear in Windows 1.” “This assertion,” Apple replied, “is contradicted by even the most casual observation of the two products.” On March 18, 1989, Judge William Schwarzer of the Federal District Court in San Francisco marked the one-year anniversary of the case by ruling against Microsoft on this issue, stating that only those attributes of Windows 2 which had also existed in Windows 1 were covered by the 1985 agreement. This meant most notably that the newer GUI’s system of overlapping windows stood outside the boundaries of that document, and thus that, as the judge put it, the 1985 agreement alone “was not a complete defense” for Microsoft. It did not, he ruled, give Microsoft the right “to develop future versions of Windows as it pleases. What Microsoft received was a license to use the visual displays in the named software products as they appeared to the user in November 1985. The displays [of Windows 1 and Windows 2] are fundamentally different.” Microsoft’s stock price promptly plummeted by 27 percent. It was an undeniable setback. “Microsoft’s major defense has been shot down,” crowed Apple’s attorneys.

“Major” was perhaps not the right choice of adjective, but certainly Microsoft’s simplest possible line of defense had proved insufficient to bail them out. It seemed that total victory could now be achieved only by invalidating the whole notion of visual copyright which underlay both the 1985 agreement and Apple’s lawsuit based on its violation. That meant a long, tough slog at best. And with Windows 3 — the version that Microsoft was convinced would finally be the breakthrough version — getting closer and closer to release and looking more and more like the Macintosh all the while, the stakes were higher than ever.

The question of look and feel and visual copyright as applied to software had implications transcending even the fate of Windows or either company. If Apple’s suit succeeded, it would transform the software business overnight, making it extremely difficult to borrow or build on the ideas of others in the way that software makers had always done in the past. Bill Gates was an avid student of business history. As was his wont, he now looked back to compare his current plight with that of an earlier titan of industry. Back in 1903, just as the Ford Motor Company was getting off the ground, Henry Ford had been hit with a lawsuit from a group of inventors claiming to own a patent on the very concept of the automobile. He had battled them for years, vowing to fight on even after losing in open court in 1909: “There will be no let-up in the legal fight,” he declared on that dark day. At last, in 1911, he won the case on appeal — winning it not only for the Ford Motor Company but for the future of the automobile industry as a field of open competition. His own legal war had similar stakes, Gates believed, and he and Microsoft intended to prosecute it in equally stalwart fashion — to win it not just for themselves but for the future of the software industry. This was necessary, he wrote in a memo, “to help set the boundaries of where copyrights should and should not be applied. We will prevail.”

(Sources: the books The Making of Microsoft: How Bill Gates and His Team Created the World’s Most Successful Software Company by Daniel Ichbiah and Susan L. Knepper, Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America by Stephen Manes and Paul Andrews, and Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer; Wall Street Journal of September 25 1987; Creative Computing of May 1985; InfoWorld of October 7 1985 and October 20 1986; MacWorld of October 1993; New York Times of March 18 1988 and March 18 1989.)

Footnotes
1 Reworked to run on the Motorola 68000 architecture, GEM would enjoy some degree of sustained success in another realm: not as an MS-DOS-hosted GUI but as the GUI hosted in the Atari ST’s ROM. In this form, it would survive well into the 1990s.
2 Apple sued Hewlett-Packard at the same time, over an application called NewWave, which ran on top of Windows and provided many Mac-like features that Windows 2 alone still lacked, such as icons representing programs and disks and a desktop workspace. But that lawsuit would always remain a sideshow to the main event, to whose fate its own was inevitably tied. So, in the interest of that aforementioned clarity and concision, we won’t concern ourselves with it here.
3 Both Microsoft and Apple have collected reams of data which they claim prove that their approach is the best one. I do suspect, however, that the original impetus can be found in the fact that MacOS was originally a single-tasking operating system, meaning that only one application, and thus only one menu bar, would ever need to be onscreen at any one time. Windows, on the other hand, was designed as a multitasking environment from the start, which made a menu bar attached to each application’s own window the more natural choice.