
The Deal of the Century (or, The Alliance of Losers)


I think [the] Macintosh accomplished everything we set out to do and more, even though it reaches most people these days as Windows.

— Andy Hertzfeld (original Apple Macintosh systems programmer), 1994

When rumors first began to circulate early in 1991 that IBM and Apple were involved in high-level talks about a major joint initiative, most people dismissed them outright. It was, after all, hard to imagine two companies in the same industry with more diametrically opposed corporate cultures. IBM was Big Blue, a bedrock of American business since the 1920s. Conservative and pragmatic to a fault, it was a Brylcreemed bastion of tradition where casual days meant that employees might remove their jackets to reveal the starched white shirts they wore underneath. Apple, on the other hand, had been founded just fifteen years before by two long-haired children of the counterculture, and its campus still looked more like Woodstock than Wall Street. IBM placed great stock in the character of its workforce; Apple, as journalist Michael S. Malone would later put it in his delightfully arch book Infinite Loop, “seemed to have no character, but only an attitude, a style, a collection of mannerisms.” IBM talked about enterprise integration and system interoperability; Apple prattled on endlessly about changing the world. IBM played Lawrence Welk at corporate get-togethers; Apple preferred the Beatles. (It was an open secret that the name the company shared with the Beatles’ old record label wasn’t coincidental.)

Unsurprisingly, the two companies didn’t like each other very much. Apple in particular had been self-consciously defining itself for years as the sworn enemy of IBM and everything it represented. When Apple had greeted the belated arrival of the IBM PC in 1981 with a full-page magazine advertisement bidding Big Blue “welcome, seriously,” it had been hard to read as anything other than snarky sarcasm. And then, and most famously, had come the “1984” television advertisement to mark the debut of the Macintosh, in which Apple was personified as a hammer-throwing freedom fighter toppling a totalitarian corporate titan — Big Blue recast as Big Brother. What would the rumor-mongers be saying next? That cats would lie down with dogs? That the Russians would tell the Americans they’d given up on the whole communism thing and would like to be friends… oh, wait. It was a strange moment in history. Why not this too, then?

Indeed, when one looked a little harder, a partnership began to make at least a certain degree of sense. Apple’s rhetoric had actually softened considerably since those heady early days of the Macintosh and the acrimonious departure of Steve Jobs which had marked their ending. In the time since, more sober minds at the company had come to realize that insulting conservative corporate customers with money to spend on Apple’s pricey hardware might be counter-productive. Most of all, though, both companies found themselves in strikingly similar binds as the 1990s got underway. After soaring to rarefied heights during the early and middle years of the previous decade, they were now being judged by an increasing number of pundits as the two biggest losers of the last few years of computing history. In the face of the juggernaut that was Microsoft Windows, that irresistible force which nothing in the world of computing could seem to defy for long, it didn’t seem totally out of line to ask whether there even was a future for IBM or Apple. Seen in this light, the pithy clichés practically wrote themselves: “the enemy of my enemy is my friend”; “any port in a storm”; etc. Other, somewhat less generous commentators just talked about an alliance of losers.

Each of the two losers had gotten to this juncture by a uniquely circuitous route.

When IBM released the IBM PC, their first mass-market microcomputer, in August of 1981, they were as surprised as anyone by the way it took off. Even as hackers dismissed it as boring and unimaginative, corporate America couldn’t get enough of the thing; a boring and unimaginative personal computer — i.e., a safe one — was exactly what they had been waiting for. IBM’s profits skyrocketed during the next several years, and the pundits lined up to praise the management of this old, enormous company for having the flexibility and wherewithal to capitalize on an emerging new market; a tap-dancing elephant became the metaphor of choice.

And yet, like so many great successes, the IBM PC bore the seeds of its downfall within it from the start. It was a simple, robust machine, easy to duplicate by plugging together readily available commodity components — a process made even easier by IBM’s commitment to scrupulously documenting every last detail of its design for all and sundry. Further, IBM had made the mistake of licensing its operating system from a small company known as Microsoft rather than buying it outright or writing one of their own, and Bill Gates, Microsoft’s Machiavellian CEO, proved more than happy to license MS-DOS to anyone else who wanted it as well. The danger signs could already be seen in 1982, when an upstart company called Compaq released a “portable” version of IBM’s computer — in those days, this meant a computer which could be packed into a single suitcase — before IBM themselves could get around to it. A more dramatic tipping point arrived in 1986, when the same company made a PC clone built around Intel’s hot new 80386 CPU before IBM managed to do so.

In 1987, IBM responded to the multiplying ranks of the clone makers by introducing the PS/2 line, which came complete with a new, proprietary bus architecture, locked up tight this time inside a cage of patents and legalese. A cynical move on the face of it, it backfired spectacularly in practice. Smelling the overweening corporate arrogance positively billowing out of the PS/2 lineup, many began to ask themselves for the first time whether the industry still needed IBM at all. And the answer they often came to was not the one IBM would have preferred. IBM’s new bus architecture slowly died on the vine, while the erstwhile clone makers put together committees to define new standards of their own which evolved the design IBM had originated in more open, commonsense ways. In short, IBM lost control of the very platform they had created. By 1990, the words “PC clone” were falling out of common usage, to be replaced by talk of the “Wintel Standard.” The new standard bearer, the closest equivalent to IBM in this new world order, was Microsoft, who continued to license MS-DOS and Windows, the software that allowed all of these machines from all of these diverse manufacturers to run the same applications, to anyone willing to pay for it. Meanwhile OS/2, IBM’s mostly-compatible alternative operating system, was struggling mightily; it would never manage to cross the hump into true mass-market acceptance.

Apple’s fall from grace had been less dizzying in some ways, but the position it had left them in was almost as frustrating.

After Steve Jobs walked away from Apple in September of 1985, leaving behind the Macintosh, his twenty-month-old dream machine, the more sober-minded caretakers who succeeded him did many of the reasonable, sober-minded things which their dogmatic predecessor had refused to allow: opening the Mac up for expansion, adding much-requested arrow keys to its keyboard, toning down the revolutionary rhetoric that spooked corporate America so badly. These things, combined with the Apple LaserWriter laser printer, Aldus PageMaker software, and the desktop-publishing niche they spawned between them, saved the odd little machine from oblivion. Yet something did seem to get lost in the process. Although the Mac remained a paragon of vision in computing in many ways — HyperCard alone proved that! — Apple’s management could sometimes seem more interested in competing head-to-head with PC clones for space on the desks of secretaries than nurturing the original dream of the Macintosh as the creative, friendly, fun personal computer for the rest of us.

In fact, this period of Apple’s history must strike anyone familiar with the company of today — or, for that matter, with the company that existed before Steve Jobs’s departure — as just plain weird. Quibbles about character versus attitude aside, Apple’s most notable strength down through the years has been a peerless sense of self, which they have used to carve out their own uniquely stylish image in the ofttimes bland world of computing. How odd, then, to see the Apple of this period almost willfully trying to become the one thing neither the zealots nor the detractors have ever seen them as: just another maker of computer hardware. They flooded the market with more models than even the most dutiful fans could keep up with, none of them evincing the flair for design that marks the Macs of earlier or later eras. Their computers’ bland cases were matched with bland names like “Performa” or “Quadra” — names which all too easily could have come out of Compaq or (gasp!) IBM rather than Apple. Even the tight coupling of hardware and software into a single integrated user experience, another staple of Apple computing before and after, threatened to disappear, as CEO John Sculley took to calling Apple a “software company” and intimated that he might be willing to license MacOS to other manufacturers in the way that Microsoft did MS-DOS and Windows. At the same time, in a bid to protect the software crown jewels, he launched a prohibitively expensive and ethically and practically ill-advised lawsuit against Microsoft for copying MacOS’s “look and feel” in Windows.

Apple’s attempts to woo corporate America by acting just as bland and conventional as everyone else bore little fruit; the Macintosh itself remained too incompatible, too expensive, and too indelibly strange to lure cautious purchasing managers into the fold. Meanwhile Apple’s prices remained too high for any but the most well-heeled private users. And so the Mac soldiered on with a 5 to 10 percent market share, buoyed by a fanatically loyal user base who still saw revolutionary potential in it, even as they complained about how many of its ideas Microsoft and others had stolen. Admittedly, their numbers were not insignificant: there were about three and a half million members of the Macintosh family by 1990. They were enough to keep Apple afloat and basically profitable, at least for now, but already by the early 1990s most new Macs were being sold “within the family,” as it were. The Mac became known as the platform where the visionaries tried things out; if said things proved promising, they then reached the masses in the form of Windows implementations. CD-ROM, the most exciting new technology of the early 1990s, was typical. The Mac pioneered this space; Mediagenic’s The Manhole, the very first CD-ROM entertainment product, shipped first on that platform. Yet most of the people who heard the hype and went out to buy a “multimedia PC” in the years that followed brought home a Wintel machine. The Mac was a sort of aspirational showpiece platform; in defiance of its old “computer for the rest of us” tagline, Windows was the place where the majority of ordinary people did ordinary things.

The state of MacOS added weight to these showhorse-versus-workhorse stereotypes. Its latest incarnation, known as System 6, had fallen alarmingly behind the state of the art in computing by 1990. Once one looked beyond its famously intuitive and elegant user interface, one found that it lacked robust support for multitasking; lacked any way to address memory beyond 8 MB; lacked the virtual memory that would allow users to open more and larger applications than the physical memory allowed; lacked the memory protection that could prevent errant applications from taking down the whole system. Having been baked into many of the operating system’s core assumptions from the start — MacOS had originally been designed to run on a machine with no hard drive and just 128 K of memory — these limitations were infuriatingly difficult to remedy after the fact. Thus Apple struggled mightily with the creation of System 7, their attempt to do just that. When System 7 finally shipped in May of 1991, two years after Apple had initially promised it would, it still lagged behind Windows under the hood in some ways: for example, it lacked comprehensive memory protection.

The problems which dogged the Macintosh were typical of any computing platform that attempts to survive beyond the technological era which spawned it. Keeping up with the times means hacking and kludging the original vision, as efficiency and technical elegance give way to the need just to make it work, by hook or by crook. The original Mac design team had been given the rare privilege of forgetting about backward compatibility — given permission to build something truly new and “insanely great,” as Steve Jobs had so memorably put it. That, needless to say, was no longer an option. Every decision at Apple must now be made with an eye toward all of the software that had been written for the Mac in the past seven years or so. People depended on it now, which sharply limited the ways in which it could be changed; any new idea that wasn’t compatible with what had come before was an ipso-facto nonstarter. Apple’s clever programmers doubtless could have made a faster, more stable, all-around better operating system than System 7 if they had only had free rein to do so. But that was pie-in-the-sky talk.

Yet the most pressing of all the technical problems confronting the Macintosh as it aged involved its hardware rather than its software. Back in 1984, the design team had hitched their wagon to the slickest, sexiest new CPU in the industry at the time: the Motorola 68000. And for several years, they had no cause to regret that decision. The 68000 and its successor models in the same family were wonderful little chips — elegant enough to live up to even the Macintosh ideal of elegance, an absolute joy to program. Even today, many an old-timer will happily wax rhapsodic about them if given half a chance. (Few, for the record, have similarly fond memories of Intel’s chips.)

But Motorola was both a smaller and a more diversified company than Intel, the international titan of chip-making. As time went on, they found it more and more difficult to keep up with the pace set by their rival. Lacking the same cutting-edge fabrication facilities, they found it hard to pack as many circuits into the same amount of space. Matters began to come to a head in 1989, when Intel released the 80486, a chip for which Motorola had nothing remotely comparable. Motorola’s response finally arrived in the form of the roughly-equivalent-in-horsepower 68040 — but not until more than a year later, and even then their chip was plagued by poor heat dissipation and heavy power consumption in many scenarios. Worse, word had it that Motorola was getting ready to give up on the whole 68000 line; they simply didn’t believe they could continue to compete head-to-head with Intel in this arena. One can hardly overstate how terrifying this prospect was for Apple. An end to the 68000 line must seemingly mean the end of the Macintosh, at least as everyone knew it; MacOS, along with every application ever written for the platform, was inextricably bound to the 68000. Small wonder that John Sculley started talking about Apple as a “software company.” It looked like their hardware might be going away, whether they liked it or not.

Motorola was, however, peddling an alternative to the 68000 line, embodying one of the biggest buzzwords in computer-science circles at the time: “RISC,” short for “Reduced Instruction Set Computer.” Both the Intel x86 line and the Motorola 68000 line were what had been retroactively named “CISC,” or “Complex Instruction Set Computers”: CPUs whose set of core opcodes — i.e., the set of low-level commands by which they could be directly programmed — grew constantly bigger and more baroque over time. RISC chips, on the other hand, pared their opcodes down to the bone, to only those commands which they absolutely, positively could not exist without. This made them less pleasant for a human programmer to code for — but then, the vast majority of programmers were working by now in high-level languages rather than directly controlling the CPU in assembly language anyway. And it made programs written to run on them bigger, generally speaking, no matter how they were produced — but then, most people by 1990 were willing to trade a bit more memory usage for extra speed. To compensate for these disadvantages, RISC chips could be simpler in terms of circuitry than CISC chips of equivalent power, making them cheaper and easier to manufacture. They also demanded less energy and produced less heat — the computer engineer’s greatest enemy — at equivalent clock speeds. As yet, only one RISC chip was serving as the CPU in mass-market personal computers: the ARM chip, used in the machines of the British PC maker Acorn, which weren’t even sold in the United States. Nevertheless, Motorola believed RISC’s time had come. By switching to RISC, they wouldn’t need to match Intel in terms of transistors per square millimeter to produce chips of equal or greater speed. Indeed, they’d already made a RISC CPU of their own, called the 88000, in which they were eager to interest Apple.

They found a receptive audience among Apple’s programmers and engineers, who loved Motorola’s general design aesthetic. Already by the spring of 1990, Apple had launched two separate internal projects to study the possibilities for RISC in general and the 88000 in particular. One, known as Project Jaguar, envisioned a clean break with the past, in the form of a brand new computer that would be so amazing that people would be willing to accept that none of their existing software would run on it. The other, known as Project Cognac, studied whether it might be possible to port the existing MacOS to the new architecture, and then — and this was the really tricky part — find a way to make existing applications which had been compiled for a 68000-based Mac run unchanged on the new machine.

At first, the only viable option for doing so seemed to be a sort of Frankenstein’s monster of a computer, containing both an 88000- and a 68000-series CPU. The operating system would boot and run on the 88000, but when the user started an application written for an older, 68000-based Mac, it would be automatically kicked over to the secondary CPU. Within a few years, so the thinking went, all existing users would upgrade to the newer models, all current software would get recompiled to run natively on the RISC chip, and the 68000 could go away. Still, no one was all that excited by this approach; it seemed the worst Macintosh kludge yet, the very antithesis of what the machine was supposed to be.

A eureka moment came in late 1990, with the discovery of what Cognac project leader Jack McHenry came to call the “90/10 Rule.” Running profilers on typical applications, his team found that in the case of many or most of them it was the operating system, not the application itself, that consumed 90 percent or more of the CPU cycles. This was an artifact — for once, a positive one! — of the original MacOS design, which offered programmers an unprecedentedly rich interface toolbox meant to make coding as quick and easy as possible and, just as importantly, to give all applications a uniform look and feel. Thus an application simply asked for a menu containing a list of entries; it was then the operating system that did all the work of setting it up, monitoring it, and reporting back to the application when the user chose something from it. Ditto buttons, dialog boxes, etc. Even something as CPU-intensive as video playback generally happened through the operating system’s QuickTime library rather than inside the application that employed it.
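To make the 90/10 Rule a bit more concrete, here is a minimal sketch in C of the division of labor the Cognac team was measuring. The function names below are purely hypothetical stand-ins — the genuine Toolbox routines were calls such as WaitNextEvent and MenuSelect — but the shape is the same: the application is little more than a dispatcher, while the waiting, drawing, and tracking all happen inside the “system” calls, which is exactly where the CPU cycles go.

```c
#include <stdio.h>
#include <stdbool.h>

typedef enum { EVT_NONE, EVT_MENU_CLICK, EVT_WINDOW_UPDATE, EVT_QUIT } EventKind;
typedef struct { EventKind kind; int detail; } Event;

/* --- "Operating system" side: this is where the real work happens. --- */
static bool system_wait_next_event(Event *e)
{
    /* Stand-in for the system polling hardware, tracking the mouse, and
       blinking cursors; here it just plays back a short scripted session. */
    static const Event script[] = {
        { EVT_WINDOW_UPDATE, 1 }, { EVT_MENU_CLICK, 3 }, { EVT_QUIT, 0 }
    };
    static int step = 0;
    if (step >= (int)(sizeof script / sizeof script[0])) return false;
    *e = script[step++];
    return true;
}

static int system_menu_select(const Event *e)
{
    /* Stand-in for the system drawing the menu, tracking the mouse,
       and reporting back which item the user chose. */
    printf("[system] drawing menu, tracking the selection...\n");
    return e->detail;
}

static void system_redraw_window(int window_id)
{
    printf("[system] redrawing window %d and all of its widgets\n", window_id);
}

/* --- Application side: little more than a dispatcher. --- */
int main(void)
{
    Event e;
    while (system_wait_next_event(&e)) {        /* the system does the waiting  */
        switch (e.kind) {
        case EVT_MENU_CLICK:
            printf("[app] user chose menu item %d\n", system_menu_select(&e));
            break;                              /* the system tracked the menu  */
        case EVT_WINDOW_UPDATE:
            system_redraw_window(e.detail);     /* the system repaints the view */
            break;
        case EVT_QUIT:
            return 0;
        default:
            break;
        }
    }
    return 0;
}
```

In a scheme like the one Cognac envisioned, only the thin application-side code here would ever need to run through a 68000 emulator; everything marked “[system]” would already be native code on the new chip.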

All of this meant that it ought to be feasible to emulate the 68000 entirely in software. The 68000 code would necessarily run slowly and inefficiently through emulation, wiping out all of the speed advantages of the new chip and then some. Yet for many or most applications the emulator would only need to be used about 10 percent of the time. The other 90 percent of the time, when the operating system itself was doing things at native speed, would more than make up for it. In due course, applications would get recompiled and the need for 68000 emulation would largely go away. But in the meanwhile, it could provide a vital bridge between the past and the future — a next-generation Mac that wouldn’t break continuity with the old one, all with a minimum of complication, for Apple’s users and for their hardware engineers alike. By mid-1991, Project Cognac had an 88000-powered prototype that could run a RISC-based MacOS and legacy Mac applications together.
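For the technically curious, the general technique at issue — interpretive emulation — can be sketched in a few dozen lines of C. This is a toy with a made-up instruction set, not a model of Apple’s actual 68000 emulator, which was a far more sophisticated piece of work; but it shows the basic shape: fetch each legacy instruction, decode it, and act it out against a software model of the old CPU’s registers, paying a dispatch overhead on every single instruction. That overhead is why emulated code ran so much slower than native code.

```c
#include <stdio.h>
#include <stdint.h>

/* Imaginary "legacy" opcodes standing in for the 68000's instruction set. */
enum { OP_LOADI, OP_ADD, OP_PRINT, OP_HALT };

/* A software model of the old CPU's state, kept in the host's memory. */
typedef struct {
    uint32_t reg[4];   /* the legacy machine's registers */
    uint32_t pc;       /* program counter into the legacy code */
} LegacyCPU;

static void run_emulated(LegacyCPU *cpu, const uint32_t *code)
{
    for (;;) {
        uint32_t op = code[cpu->pc++];          /* fetch  */
        switch (op) {                           /* decode */
        case OP_LOADI: {                        /* execute... */
            uint32_t r = code[cpu->pc++];
            uint32_t v = code[cpu->pc++];
            cpu->reg[r] = v;
            break;
        }
        case OP_ADD: {
            uint32_t dst = code[cpu->pc++];
            uint32_t src = code[cpu->pc++];
            cpu->reg[dst] += cpu->reg[src];
            break;
        }
        case OP_PRINT:
            printf("legacy program says: %u\n", (unsigned)cpu->reg[code[cpu->pc++]]);
            break;
        case OP_HALT:
            return;                             /* ...then loop for the next one */
        }
    }
}

int main(void)
{
    /* A tiny "legacy application": load 2 and 3, add them, print the sum. */
    const uint32_t legacy_program[] = {
        OP_LOADI, 0, 2,   OP_LOADI, 1, 3,   OP_ADD, 0, 1,   OP_PRINT, 0,   OP_HALT
    };
    LegacyCPU cpu = { { 0, 0, 0, 0 }, 0 };
    run_emulated(&cpu, legacy_program);
    return 0;
}
```

An application recompiled for the new architecture, by contrast, never enters a loop like this at all, which is why the long-term plan was always to treat emulation as a bridge rather than a destination.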

And yet this wasn’t to be the final form of the RISC-based Macintosh. For, just a few months later, Apple and IBM made an announcement that the technology press billed — sometimes sarcastically, sometimes earnestly — as the “Deal of the Century.”

Apple had first begun to talk with IBM in early 1990, when Michael Spindler, the former’s president, had first reached out to Jack Kuehler, his opposite number at IBM. It seemed that, while Apple’s technical rank and file were still greatly enamored with Motorola, upper management was less sanguine. Having been burned once with the 68000, they were uncertain about Motorola’s commitment and ability to keep evolving the 88000 over the long term.

It made a lot of sense in the abstract for any company interested in RISC technology, as Apple certainly was, to contact IBM; it was actually IBM who had invented the RISC concept back in the mid-1970s. Not all that atypically for such a huge company with so many ongoing research projects, they had employed the idea for years only in limited, mostly subsidiary usage scenarios, such as mainframe channel controllers. Now, though, they were just introducing a new line of “workstation computers” — meaning extremely high-powered desktop computers, too expensive for the consumer market — which used a RISC chip called the POWER CPU that was the heir to their many years of research in the field. Like the workstations it lay at the heart of, the chip was much too expensive and complex to become the brain of Apple’s next generation of consumer computers, but it might, thought Spindler, be something to build upon. And he knew that, with IBM’s old partnership with Microsoft slowly collapsing into bickering acrimony, Big Blue might just be looking for a new partner.

The back-channel talks were intermittent and hyper-cautious at first, but, as the year wore on and the problems both of the companies faced became more and more obvious, the discussions heated up. The first formal meeting took place in February of 1991 or shortly thereafter, at an IBM facility in Austin, Texas. The Apple people, knowing IBM’s ultra-conservative reputation and wishing to make a good impression, arrived neatly groomed and dressed in three-piece suits, only to find their opposite numbers, having acted on the same motivation, sitting there in jeans and denim shirts.

That anecdote illustrates how very much both sides wanted to make this work. And indeed, the two parties found it much easier to work together than anyone might have imagined. John Sculley, the man who really called the shots at Apple, found that he got along smashingly with Jack Kuehler, to the extent that the two were soon talking almost every day. After beginning as a fairly straightforward discussion of whether IBM might be able and willing to make a RISC chip suitable for the Macintosh, the negotiations just kept growing in scale and ambition, spurred on by both companies’ deep-seated desire to stick it to Microsoft and the Wintel hegemony in any and all possible ways. They agreed to found a joint subsidiary called Taligent, staffed initially with the people from Apple’s Project Jaguar, which would continue to develop a brand new operating system that could be licensed by any hardware maker, just like MS-DOS and Windows (and for that matter IBM’s already extant OS/2). And they would found another subsidiary called Kaleida Labs, to make a cross-platform multimedia scripting engine called ScriptX.

Still, the core of the discussions remained IBM’s POWER architecture — or rather the PowerPC, as the partners agreed to call the cost-reduced, consumer-friendly version of the chip. Apple soon pulled Motorola into these parts of the talks, thus turning a bilateral into a trilateral negotiation, and providing the name for their so-called “AIM alliance” — “AIM” for Apple, IBM, and Motorola. IBM had never made a mass-market microprocessor of their own before, noted Apple, and Motorola’s experience could serve them well, as could their chip-fabrication facilities once actual production began. The two non-Apple parties were perhaps less excited at the prospect of working together — Motorola in particular must have been smarting at the rejection of their own 88000 processor which this new plan would entail — but made nice and got along.

Jack Kuehler and John Sculley brandish what they call their “marriage certificate,” looking rather disturbingly like Neville Chamberlain declaring peace in our time. The marriage would not prove an overly long or happy one.

On October 2, 1991 — just six weeks after the first 68040-based Macintosh models had shipped — Apple and IBM made official the rumors that had been swirling around for months. At a joint press briefing held inside the Fairmont Hotel in downtown San Francisco, they trumpeted all of the initiatives I’ve just described. The Deal of the Century, they said, would usher in the next phase of personal computing. Wintel must soon give way to the superiority of a PowerPC-based computer running a Taligent operating system with ScriptX onboard. New Apple Macintosh models would also use the PowerPC, but the relationship between them and these other, Taligent-powered machines remained vague.

Indeed, it was all horribly confusing. “What Taligent is doing is not designed to replace the Macintosh,” said Sculley. “Instead we think it complements and enhances its usefulness.” But what on earth did that empty corporate speak even mean? When Apple said out of the blue that they were “not going to do to the Macintosh what we did to the Apple II” — i.e., orphan it — it rather made you suspect that that was exactly what they meant to do. And what did it all mean for IBM’s OS/2, which Big Blue had been telling a decidedly unconvinced public was also the future of personal computing for several years now? “I think the message in those agreements for the future of OS/2 is that it no longer has a future,” said one analyst. And then, what was Kaleida and this ScriptX thing supposed to actually do?

So much of the agreement seemed so hopelessly vague. Compaq’s vice president declared that Apple and IBM must be “smoking dope. There’s no way it’s going to work.” One pundit called the whole thing “a con job. There’s no software, there’s no operating system. It’s just a last gasp of extinction by the giants that can’t keep up with Intel.” Apple’s own users were baffled and consternated by this sudden alliance with the company which they had been schooled to believe was technological evil incarnate. A grim joke made the rounds: what do you get when you cross Apple and IBM? The answer: IBM.

While the journalists reported and the pundits pontificated, it was up to the technical staff at Apple, IBM, and Motorola to make PowerPC computers a reality. Like their colleagues who had negotiated the deal, they all got along surprisingly well; once one pushed past the surface stereotypes, they were all just engineers trying to do the best work possible. Apple’s management wanted the first PowerPC-based Macintosh models to ship in January of 1994, to commemorate the platform’s tenth anniversary by heralding a new technological era. The old Project Cognac team, now with the new code name of “Piltdown Man” after the famous (albeit fraudulent) “missing link” in the evolution of humanity, was responsible for making this happen. For almost a year, they worked on porting MacOS to the PowerPC, as they’d previously done to the 88000. This time, though, they had no real hardware with which to work, only specifications and software emulators. The first prototype chips finally arrived on September 3, 1992, and they redoubled their efforts, pulling many an all-nighter. Thus MacOS booted up to the desktop for the first time on a real PowerPC-based machine just in time to greet the rising sun on the morning of October 3, 1992. A new era had indeed dawned.

Their goal now was to make a PowerPC-based Macintosh work exactly like any other, only faster. MacOS wouldn’t even get a new primary version number for the first PowerPC release; this major milestone in Mac history would go under the name of System 7.1.2, a name more appropriate to a minor maintenance release. It looked so identical to what had come before that its own creators couldn’t spot the difference; they wound up lighting up a single extra pixel in the PowerPC version just so they could know which was which.

Their guiding rule of an absolutely seamless transition applied in spades to the 68000 emulation layer, duly ported from the 88000 to the PowerPC. An ordinary user should never have to think about — should not even have to know about — the emulation that was happening beneath the surface. Another watershed moment came in June of 1993, when the team brought a PowerPC prototype machine to MacHack, a coding conference and competition. Without telling any of the attendees what was inside the machine, the team let them use it to demonstrate their boundary-pushing programs. The emulation layer performed beyond their most hopeful prognostications. It looked like the Mac’s new lease on life was all but a done deal from the engineering side of things.

But alas, the bonhomie exhibited by the partner companies’ engineers and programmers down in the trenches wasn’t so marked in their executive suites after the deal was signed. The very vagueness of so many aspects of the agreement had papered over what were in reality hugely different visions of the future. IBM, a company not usually given to revolutionary rhetoric, had taken at face value the high-flown words spoken at the announcement. They truly believed that the agreement would mark a new era for personal computing in general, with a new, better hardware architecture in the form of PowerPC and an ultra-modern operating system to run on it in the form of Taligent’s work. Meanwhile it was becoming increasingly clear that Apple’s management, who claimed to be changing the world five times before breakfast on most days, had in reality seen Taligent largely as a hedge in case their people should prove unable to create a PowerPC Macintosh that looked like a Mac, felt like a Mac, and ran vintage Mac software. As Project Piltdown Man’s work proceeded apace, Apple grew less and less enamored with those other, open-architecture ideas IBM was pushing. The Taligent people didn’t help their cause by falling headfirst into a pit of airy computer-science abstractions and staying mired there for years, all while Project Piltdown Man just kept plugging away, getting things done.

The first two and a half years of the 1990s were marred by a mild but stubborn recession in the United States, during which the PC industry had a particularly hard time of it. After the summer of 1992, however, the economy picked up steam and consumer computing eased into what would prove its longest and most sustained boom of all time, borne along on a wave of hype about CD-ROM and multimedia, along with the simple fact that personal computers in general had finally evolved to a place where they could do useful things for ordinary people in a reasonably painless way. (A bit later in the boom, of course, the World Wide Web would come along to provide the greatest impetus of all.)

And yet the position of both Apple and IBM in the PC marketplace continued to get steadily worse while the rest of their industry soared. At least 90 percent of the computers that were now being sold in such impressive numbers ran Microsoft Windows, leaving OS/2, MacOS, and a few other oddballs to divide the iconoclasts, the hackers, and the non-conformists of the world among themselves. While IBM continued to flog OS/2, more out of stubbornness than hope, Apple tried a little bit of everything to stop the slide in market share and remain relevant. Still not entirely certain whether their future lay with open architectures or their own closed, proprietary one, they started porting selected software to Windows, including most notably QuickTime, their much-admired tool for encoding and playing video. They even shipped a Mac model that could also run MS-DOS and Windows, thanks to an 80486 housed in its case alongside its 68040. And they entered into a partnership with the networking giant Novell to port MacOS itself to Intel hardware — a partnership that, like many Apple initiatives of these years, petered out without ultimately producing much of anything. Perhaps most tellingly of all, this became the only period in Apple’s history when the company felt compelled to compete solely on price. They started selling Macs in department stores for the first time, where a stream of very un-Apple-like discounts and rebates greeted prospective buyers.

While Apple thus toddled along without making much headway, IBM began to annihilate all previous conceptions of how much money a single company could possibly lose, posting oceans of red that looked more like the numbers found in macroeconomic research papers than entries in an accountant’s books. The PC marketplace was in a way one of their smaller problems. Their mainframe business, their real bread and butter since the 1950s, was cratering as customers fled to the smaller, cheaper computers that could often now do the jobs of those hulking giants just as well. In 1991, when IBM first turned the corner into loss, they did so in disconcertingly convincing fashion: they lost $2.82 billion that year. And that was only the beginning. Losses totaled $4.96 billion in 1992, followed by $8.1 billion in 1993. IBM lost more money during those three years alone than any other company in the history of the world to that point; their losses exceeded the gross domestic product of Ecuador.

The employees at both Apple and IBM paid the toll for the confusions and prevarications of these years: both companies endured rounds of major layoffs. Those at IBM marked the very first such in the long history of the company. Big Blue had for decades fostered a culture of employment for life; their motto had always been, “If you do your job, you will always have your job.” This, it was now patently obvious, was no longer the case.

The bloodletting reached both companies’ executive suites as well, within a few months of one another. On April 1, 1993, John Akers, the CEO of IBM, was ousted after a seven-year tenure which one business writer called “the worst record of any chief executive in the history of IBM.” Three months later, following a terrible quarterly earnings report and a drop in share price of 58 percent in the span of six months, Michael Spindler replaced John Sculley as the CEO of Apple.

These, then, were the storm clouds under which the PowerPC architecture became a physical reality.

The first PowerPC computers to be given a public display bore an IBM rather than an Apple logo on their cases. They arrived at the Comdex trade show in November of 1993, running a port of OS/2. IBM also promised a port of AIX — their version of the Unix operating system — while Sun Microsystems announced plans to port their Unix-based Solaris operating system and, most surprisingly of all, Microsoft talked about porting over Windows NT, the more advanced, server-oriented version of their world-conquering operating environment. But, noted the journalists present, “it remains unclear whether users will be able to run Macintosh applications on IBM’s PowerPC” — a fine example of the confusing messaging the two alleged allies constantly trailed in their wake. Further, there was no word at all about the status of the Taligent operating system that was supposed to become the real PowerPC standard.

Meanwhile over at Apple, Project Piltdown Man was becoming that rarest of unicorns in tech circles: a major software-engineering project that is actually completed on schedule. The release of the first PowerPC Macs was pushed back a bit, but only to allow the factories time to build up enough inventory to meet what everyone hoped would be serious consumer demand. Thus the “Power Macs” made their public bow on March 14, 1994, at New York City’s Lincoln Center, in three different configurations clocked at speeds between 60 and 80 MHz. Unlike IBM’s machines, which were shown six months before they shipped, the Power Macs were available for anyone to buy the very next day.

The initial trio of Power Macs.

This speed test, published in MacWorld magazine, shows how all three of the Power Mac machines dramatically outperform top-of-the-line Pentium machines when running native code.

They were greeted with enormous excitement and enthusiasm by the Mac faithful, who had been waiting anxiously for a machine that could go head-to-head with computers built around Intel’s new Pentium chip, the successor to the 80486. This the Power Macs could certainly do; by some benchmarks at least, the PowerPC doubled the overall throughput of a Pentium. World domination must surely be just around the corner, right?

Predictably enough, the non-Mac-centric technology press greeted the machines’ arrival more skeptically than the hardcore Mac-heads. “I think Apple will sell [a] million units, but it’s all going to be to existing Mac users,” said one market researcher. “DOS and Windows running on Intel platforms is still going to be 85 percent of the market. [The Power Mac] doesn’t give users enough of a reason to change.” Another noted that “the Mac users that I know are not interested in using Windows, and the Windows users are not interested in using the Mac. There has to be a compelling reason [to switch].”

In the end, these more guarded predictions proved the most accurate. Apple did indeed sell an impressive spurt of Power Macs in the months that followed, but almost entirely to the faithful. One might almost say that they became a victim of Project Piltdown Man’s success: the Power Mac really did seem exactly like any other Macintosh, except that it ran faster. And even this fact could be obscured when running legacy applications under emulation, as most people were doing in the early months: despite Project Piltdown Man’s heroic efforts, applications like Excel, Word, and Photoshop actually ran slightly slower on a Power Mac than on a top-of-the-line 68040-based machine. So, while the transition to PowerPC allowed the Macintosh to persist as a viable computing platform, it ultimately did nothing to improve upon its small market share. And because the PowerPC MacOS was such a direct and literal port, it still retained all of the shortcomings of MacOS in general. It remained a pretty interface stretched over some almost laughably archaic plumbing. The new generation of Mac hardware wouldn’t receive an operating system truly, comprehensively worthy of it until OS X arrived seven long years later.

Still, these harsh realities shouldn’t be allowed to detract from how deftly Apple — and particularly the unsung coders of Project Piltdown Man — executed the transition. No one before had ever picked up a consumer-computing platform bodily and moved it to an entirely new hardware architecture, much less done it so transparently that many or most users never really had to think about what was happening at all. (There would be only one comparable example in computing’s future. And, incredibly, the Mac would once again be the platform in question: in 2006, Apple would move from the fading PowerPC line to Intel’s chips — if you can’t beat ’em, join ’em, right? — relying once again on a cleverly coded software emulator to see them through the period of transition. The Macintosh, it seems, has more lives than Lazarus.)

Although the briefly vaunted AIM alliance did manage to give the Macintosh a new lease on life, it succeeded in very little else. The PowerPC architecture, which had cost the alliance more than $1 billion to develop, went nowhere in its non-Mac incarnations. IBM’s own machines sold in such tiny numbers that the question of whether Apple would ever allow them to run MacOS was all but rendered moot. (For the record, though: they never did.) Sun Solaris and Microsoft Windows NT did come out in PowerPC versions, but their sales couldn’t justify their existence, and within a year or two they went away again. The bold dream of creating a new reference platform for general-purpose computing to rival Wintel never got off the ground, as it became painfully clear that said dream had been taken more to heart by IBM than by Apple. Only after the millennium would the PowerPC architecture find a measure of mass-market success outside the Mac, when it was adopted by Nintendo, Microsoft, and Sony for use in videogame consoles. In this form, then, it finally paid off for IBM; far more PowerPC-powered consoles than even Macs were sold over the lifetime of the architecture. PowerPC also eventually saw use in other specialized applications, such as satellites and planetary rovers employed by NASA.

Success, then, is always relative. But not so the complete lack thereof, as Kaleida and Taligent proved. Kaleida burned through $200 million before finally shipping its ScriptX multimedia-presentation engine years after other products, most notably Macromedia’s Director, had already sewn up that space; it was disbanded and harvested for scraps by Apple in November of 1995. Taligent burned through a staggering $400 million over the same period of time, producing only some tepid programming frameworks in lieu of the revolutionary operating system that had been promised, before being absorbed back into IBM.

There is one final fascinating footnote to this story of a Deal of the Century that turned out to be little more than a strange anecdote in computing history. In the summer of 1994, IBM, having by now stopped the worst of the bleeding and settled into their new life as a smaller, far less dominant company, offered to buy Apple outright for a premium of $5 over their current share price. In IBM’s view, the synergies made sense: the Power Macs were selling extremely well, which was more than could be said for IBM’s PowerPC models. Why not go all in?

Ironically, it was those same healthy sales numbers that scuppered the deal in the end. If the offer had come a year earlier, when a money-losing Apple was just firing John Sculley, they surely would have jumped at it. But now Apple was feeling their oats again, and by no means entirely without reason; sales were up more than 20 percent over the previous year, and the company was once more comfortably in the black. So, they told IBM thanks, but no thanks. The same renewed taste of success also caused them to reject serious inquiries from Philips, Sun Microsystems, and Oracle. Word had it that new CEO Michael Spindler was convinced not only that the Power Mac had saved Apple, but that it had fundamentally altered their position in the marketplace.

The following year revealed how misguided that thinking really was; the Power Mac had fixed none of Apple’s fundamental problems. That year it was Microsoft who cemented their world domination instead, with the release of Windows 95, while Apple grappled with the reality that almost all of those Power Mac sales of the previous year had been to existing members of the Macintosh family, not to the new customers they so desperately needed to attract. What happened now that everyone in the family had dutifully upgraded? The answer to that question wasn’t pretty: Apple plunged off a financial cliff as precipitous in its own way as the one which had nearly destroyed IBM a few years earlier. Now, nobody was interested in acquiring them anymore. The pundits smelled the stink of death; it’s difficult to find an article on Apple written between 1995 and 1998 which doesn’t include the adjective “beleaguered.” Why buy now when you can sift through the scraps at the bankruptcy auction in just a little while?

Apple didn’t wind up dying, of course. Instead a series of improbable events, beginning with the return of prodigal-son Steve Jobs in 1997, turned them into the richest single company in the world — yes, richer even than Microsoft. These are stories for other articles. But for now, it’s perhaps worth pausing for a moment to think about an alternate timeline where the Macintosh became an IBM product, and the Deal of the Century that got that ball rolling thus came much closer to living up to its name. Bizarre, you say? Perhaps. But no more bizarre than what really happened.

(Sources: the books Insanely Great: The Life and Times of Macintosh by Steven Levy, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer, Infinite Loop: How the World’s Most Insanely Great Computer Company Went Insane by Michael S. Malone, Big Blues: The Unmaking of IBM by Paul Carroll, and The PowerPC Macintosh Book by Stephan Somogyi; InfoWorld of September 24 1990, October 15 1990, December 3 1990, April 8 1991, May 13 1991, May 27 1991, July 1 1991, July 8 1991, July 15 1991, July 22 1991, August 5 1991, August 19 1991, September 23 1991, September 30 1991, October 7 1991, October 21 1991, November 4 1991, December 30 1991, January 13 1992, January 20 1992, February 3 1992, March 9 1992, March 16 1992, March 23 1992, April 27 1992, May 11 1992, May 18 1992, June 15 1992, June 29 1992, July 27 1992, August 3 1992, August 10 1992, August 17 1992, September 7 1992, September 21 1992, October 5 1992, October 12 1992, October 19 1992, December 14 1992, December 21 1992, December 28 1992, January 11 1993, February 1 1993, February 22 1993, March 8 1993, March 15 1993, April 5 1993, April 12 1993, May 17 1993, May 24 1993, May 31 1993, June 21 1993, June 28 1993, July 5 1993, July 12 1993, July 19 1993, August 2 1993, August 9 1993, August 30 1993, September 6 1993, September 27 1993, October 4 1993, October 11 1993, October 18 1993, November 1 1993, November 15 1993, November 22 1993, December 6 1993, December 13 1993, December 20 1993, January 10 1994, January 31 1994, March 7 1994, March 14 1994, March 28 1994, April 25 1994, May 2 1994, May 16 1994, June 6 1994, June 27 1994; MacWorld of September 1992, February 1993, July 1993, September 1993, October 1993, November 1993, February 1994, and May 1994; Byte of November 1984. Online sources include IBM’s own corporate-history timeline and a vintage IBM lecture on the PowerPC architecture.)

 
 


Doing Windows, Part 8: The Outsiders

Microsoft Windows 3.0’s conquest of the personal-computer marketplace was bad news for a huge swath of the industry. On the software side, companies like Lotus and WordPerfect, only recently so influential that it was difficult to imagine a world that didn’t include them, would never regain the clout they had enjoyed during the 1980s, and would gradually fade away entirely. On the hardware side, it was true that plenty of makers of commodity PC clones were happier to work with a Microsoft who believed a rising tide lifted all their boats than against an IBM that was continually trying to put them out of business. But what of Big Blue themselves, still the biggest hardware maker of all, who were accustomed to dictating the direction of the industry rather than being dictated to by any mere maker of software? And what, for that matter, of Apple? Both Apple and IBM found themselves in the unaccustomed position of being the outsiders in this new Windows era of computing. Each must come to terms with Microsoft’s newfound but overwhelming power, even as each remained determined not to give up the heritage of innovation that had gotten them this far.

Having chosen to declare war on Microsoft in 1988, Apple seemed to have a very difficult road indeed in front of them — and that was before Xerox unexpectedly reentered the picture. On December 14, 1989, the latter shocked everyone by filing a $150 million lawsuit of their own, accusing Apple of ripping off the user interface employed by the Xerox Star office system before Microsoft allegedly ripped the same thing off from Apple.

The many within the computer industry who had viewed the implications of Apple’s recent actions with such concern couldn’t help but see this latest development as the perfect comeuppance for their overweening position on “look and feel” and visual copyright. These people now piled on with glee. “Apple can’t have it both ways,” said John Shoch, a former Xerox PARC researcher, to the New York Times. “They can’t complain that Microsoft [Windows has] the look and feel of the Macintosh without acknowledging the Mac has the look and feel of the Star.” In his 1987 autobiography, John Sculley himself had written the awkward words that “the Mac, like the Lisa before it, was largely a conduit for technology” developed by Xerox. How exactly was it acceptable for Apple to become a conduit for Xerox’s technology but unacceptable for Microsoft to become a conduit for Apple’s? “Apple is running around persecuting Microsoft over things they borrowed from Xerox,” said one prominent Silicon Valley attorney. The Xerox lawsuit raised uncomfortable questions of the sort which Apple would have preferred not to deal with: questions about the nature of software as an evolutionary process — ideas building upon ideas — and what would happen to that process if everyone started suing everyone else every time somebody built a better mousetrap.

Still, before we join the contemporary commentators in their jubilation at seeing Apple hoisted with their own petard, we should consider the substance of this latest case in more detail. Doing so requires that we take a closer look at what Xerox had actually created back in the day, and take particularly careful note of which of those creations was named in their lawsuit.

Broadly speaking, Xerox created two different GUI environments in the course of their years of experimentation in this area. The first and most heralded of these was known as the Smalltalk environment, pioneered by the researcher Alan Kay in 1975 on a machine called the Xerox Alto, which had been designed at PARC and was built only in limited quantities, without ever being made available for sale through traditional commercial channels. This was the machine and the environment which Steve Jobs so famously saw on his pair of visits to PARC in December of 1979 — visits which directly inspired first the Apple Lisa and later the Macintosh.

The Smalltalk environment running on a Xerox Alto, a machine built at Xerox PARC in the mid-1970s but never commercially released. Many of the basic ideas of the GUI are here, but much remains to be developed and much is implemented only in a somewhat rudimentary way. For instance, while windows can overlap one another, windows that are obscured by other windows are never redrawn. In this way the PARC researchers neatly avoided one of the most notoriously difficult aspects of implementing a windowing system. When Apple programmer Bill Atkinson was part of the delegation who made that December 1979 visit to PARC, he thought he did see windows that continued to update even when partially obscured by other windows. He then proceeded to find a way to give the Lisa and Macintosh’s windowing engine this capability. Seldom has a misunderstanding had such a fortuitous result.

Xerox’s one belated attempt to parlay PARC’s work on the GUI into a real commercial product took the form of the Xerox Star, an integrated office-productivity system costing $16,500 per workstation upon its release in 1981. Neither Kay nor most of the other key minds behind the Alto and Smalltalk were involved in its development. Yet its GUI strikes modern eyes as far more refined than that of Smalltalk. Importantly, the metaphor of the desktop, and the soon-to-be ubiquitous idea of a skeuomorphic user interface built from stand-ins for real-world office equipment — a trash can, file folders, paper documents, etc. — were apparently the brainchildren of the product-focused Star team rather than the blue-sky researchers who worked at PARC during the 1970s.

The Xerox Star office system, which was released in 1981. This system looks much more familiar to our modern eyes than the Xerox Alto’s Smalltalk, sporting such GUI staples as menus, widgets, and icons. Yet it was still lacking in many areas compared to the GUIs that would follow. Windows were neither free-dragging nor overlapping, and its menus were one-shot commands, not drop-down lists. It most resembles VisiCorp’s Visi On among the GUIs we’ve looked at closely in this series of articles. Both products serve as a telling snapshot of the state of the art in GUIs just before Apple shook everything up with the Lisa and Macintosh.

The Star, which failed dismally due to its high price and Xerox’s lack of marketing acumen, is often reduced to little more than a footnote to the story of PARC, treated as a workmanlike translation of PARC’s grand ideas and technologies into a somewhat problematic product. Yet there’s actually an important philosophical difference between Smalltalk and the Star, born of the different engineering cultures that produced them. Smalltalk emphasized programming, to the point that the environment could literally be re-programmed on the fly as you used it. This was very much in keeping with the early ethos of home computing as well, when all machines booted into BASIC and an ability to program was considered key for every young person’s future — when every high school, it seemed, was instituting classes in BASIC or Pascal. The Star, on the other hand, was engineered to ensure that the non-technical office worker never needed to see a line of code; this machine conformed to the human rather than asking the human to conform to it. One might say that Smalltalk was intended to make the joy of computing — of using the computer as the ultimate anything machine — as accessible as possible, while the Star was intended to make you forget that you were using a computer at all.

While I certainly don’t wish to dismiss or minimize the visionary work done at PARC in the 1970s, I do believe that historians of early microcomputer GUIs have tended to somewhat over-emphasize the innovations of Smalltalk and the Alto while selling the Xerox Star’s influence rather short. Steve Jobs’s early visits to PARC are given much weight in the historical record, but it’s sometimes forgotten that anything Apple wished to copy from Smalltalk had to be done from memory; they had no regular access to the PARC technology after those visits. The Star, on the other hand, did ship as a commercial product some two years before the Lisa. Notably, the Star’s philosophy of hiding the “computery” aspects of computing from the user would turn out to be much more in line with the one that guided the Lisa and Macintosh than was Smalltalk’s approach of exposing its innards for all to see and modify. The Star was a closed black box, capable of running only the software provided for it by Xerox. Similarly, the Lisa couldn’t be programmed at all except by buying a second Lisa and chaining the two machines together, and even the Macintosh never had the reputation of being a hacker’s plaything in the way of the earlier, more hobbyist-oriented Apple II. The Lisa and Macintosh thus joined the Star in embracing a clear divide between coding professionals, who wrote the software, and end users, who bought it and used it to get stuff done. One could thus say that they resemble the Star much more than Smalltalk not only visually but philosophically.

Counter-intuitive though it is to the legend of the Macintosh being a direct descendant of the work Steve Jobs saw at PARC, Xerox sued Apple over the interface elements they had allegedly stolen from the Star rather than Smalltalk. In evaluating the merits of their claim today, I’m somewhat hamstrung by the fact that no working emulators of the original Star exist (a situation that has changed since this article was first written), forcing me to rely on screenshots, manuals, and contemporary articles about the system. Nevertheless, those sources are enough to identify an influence of the Star upon the Macintosh that’s every bit as clear-cut as that of the Macintosh upon Microsoft Windows. It strains the bounds of credibility to believe that the Mac team coincidentally developed a skeuomorphic interface using many of the very same metaphors — including the central metaphor of the desktop — without taking the example of the Star to heart. To this template they added much innovation, including such modern GUI staples as free-dragging and overlapping windows, drop-down menus, and draggable icons, along with such mouse gestures as the hold-and-drag and the double-click. Nonetheless, the foundations of the Mac can be seen in the Star much more obviously than they can in Smalltalk. Crudely put, Apple copied the Star while adding a whole lot of original ideas to the mix, and then Microsoft copied Apple, adding somewhat fewer ideas of their own. The people rejoicing over the Xerox lawsuit, in other words, had this aspect of the story basically correct, even if they did have a tendency to confuse Smalltalk and the Star and misunderstand which of them Xerox was actually suing over.

MacOS started with the skeuomorphic desktop model of the Xerox Star and added to it such fundamental modern GUI concepts as pull-down menus, hold-and-drag, the double-click, and free-dragging, overlapping windows that update themselves even when partially occluded by others.

Of course, the Xerox lawsuit against Apple was legally suspect for all the same reasons as the Apple lawsuit against Microsoft. If anything, there were even more reasons to question the good faith of Xerox’s lawsuit than Apple’s. The source of Xerox’s sudden litigiousness was none other than Bill Lowe, the former IBM executive whose disastrous PS/2 brainchild had already made his attitude toward intellectual property all too clear. Lowe had made a soft landing at Xerox after leaving IBM, and was now telling the press about the “aggressive stand on copyright and patent issues” his new company would be taking from now on. It certainly sounded like he intended to weaponize the long string of innovations credited to Xerox PARC and the Star — using these ideas not to develop products, but to sue others who dared to do so. Lowe’s hoped-for endgame was weirdly similar to his misbegotten hopes for the PS/2’s Micro Channel Architecture: Xerox would eventually license the right to make GUIs and other products to companies like Apple and Microsoft, profiting off their innovations of the past without having to do much of anything in the here and now. This understandably struck many of the would-be licensees as a less than ideal outcome. That, at least, was something on which Apple, Microsoft, and just about everyone else in the computer industry could agree.

Apple’s legal team was left in one heck of an awkward fix. They would seemingly have to argue against Xerox’s broad interpretation of visual copyright while arguing for that same broad interpretation in their own lawsuit against Microsoft — and all in the same court in front of the same judge. Any victory against Xerox could lead to their own words being used against them to precipitate a loss against Microsoft, and vice versa.

It was therefore extremely fortunate for Apple that Judge Vaughn R. Walker struck down Xerox’s lawsuit almost before it had gotten started. At the time of their court filing, Xerox was already outside the statute of limitations for a copyright-infringement claim of the type that Apple had filed against Microsoft. They had thus been forced to make a claim of “unfair competition” instead — a claim which carried with it a much higher evidentiary standard. On March 24, 1990, Judge Walker tossed the Xerox lawsuit, saying it didn’t meet this standard and making the unhelpful observation to Xerox that it would have made a lot more sense as a copyright claim. Apple had dodged a bullet, and Bill Lowe would have to find some other way to make money for his new company.

With the Xerox sideshow thus dispensed with, Apple’s lawyers could turn their attention back to the main event, their case against Microsoft. The same Judge Walker who had decided in their favor against Xerox had taken over from Judge William Schwarzer in the other case as well. No longer needing to worry about protecting their flank from Xerox, Apple’s lawyers pushed for what they called “total concept” or “gestalt” look and feel as the metric for deciding whether Windows infringed upon MacOS. But on March 6, 1991, Judge Walker agreed with Microsoft’s contention that the case should be decided on a “function by function” basis instead. Microsoft began assembling reels of video demonstrating what they claimed to be pre-Macintosh examples of each one of the ten interface elements that were at issue in the case.

So, even as Windows 3.0 was conquering the world outside the courtroom, both sides remained entrenched in their positions inside it, and the case, already three years old, ground on and on through motion after counter-motion. “We’re going to trial,” insisted Edward B. Stead, Apple’s general counsel, but it wasn’t at all clear when that trial would take place. Part of the problem was the sheer pace of external events. As Windows 3.0 became the fastest-selling piece of commercial software the world had ever seen, the scale and scope of Apple’s grievances just kept growing to match. From the beginning, a key component of Microsoft’s strategy had been to gum up the works in court while Windows 3.0 became a fait accompli, the new standard in personal computing, too big for any court to dare attack. That strategy seemed to be working beautifully. Meanwhile Apple’s motions grew increasingly far-fetched, beginning to take on a distinct taint of desperation.

In May of 1991, for example, Apple’s lawyers surprised everyone with a new charge. Still looking for a way to expand the case beyond those aspects of Windows 2 and 3 which hadn’t existed in Windows 1, they now claimed that the 1985 agreement which had been so constantly troublesome to them in that respect was invalid. Microsoft had allegedly defrauded Apple by saying they wouldn’t make future versions of Windows any more similar to the Macintosh than the first was, and then going against their word. This new charge was a hopeful exercise at best, especially given that the agreement Apple claimed Microsoft had broken had been, if it ever existed, strictly a verbal one; absolutely no language to this effect was to be found in the text of the 1985 agreement. Microsoft’s lawyers, once they picked their jaws up off the floor, were left fairly spluttering with indignation. Attorney David T. McDonald labeled the argument “desperate” and “preposterous”: “We’re on the five-yard line, the goal is in sight, and Apple now shows up and says, ‘How about lacrosse instead of football?'” Thankfully, Judge Walker found Apple’s argument to be as ludicrous as McDonald did, thus sparing us all any more sports metaphors.

On April 14, 1992 — now more than four years on from Apple’s original court filing, in a computing climate transformed almost beyond recognition by the rise of Windows — Judge Walker ruled against Apple’s remaining contentions in devastating fashion. Much of the 1985 agreement was indeed invalid, he said, but not for the reason Apple had claimed. What Microsoft had licensed in that agreement were largely “generic ideas” that should never be susceptible to copyright protection in the first place. Apple was entitled to protect very specific visual elements of their displays, such as the actual icons they used, but they weren’t entitled to protect the notion of a screen with icons in the abstract, nor even that of icons representing specific real-world objects, such as a disk, a folder, or a trash can. Microsoft or anyone else could, in other words, make a GUI with a trash-can icon if they wished; they just couldn’t transplant Apple’s specific rendering of a trash can into their own work. Applying the notion of visual copyright any more broadly than this “would afford too much protection and yield too little competition,” said the judge. Apple’s slippery notion of look and feel, it appeared, was dead as a basis for copyright. After all the years of struggle and at least $10 million in attorney fees on both sides, Judge Walker ruled that Apple’s case was too weak to even present before a jury. “Through five years, there were many points where the case got continuously refined and focused and narrowed,” said a Microsoft spokesman. “Eventually, there was nothing left.”

Still, one can’t accuse Apple of giving up without a fight. They dragged the case out for almost three more years after this seemingly definitive defeat. When the Ninth Circuit Court of Appeals upheld Judge Walker’s judgment in 1994, Apple tried to take the case all the way to the Supreme Court. That august body announced that they would not hear it on February 21, 1995, thus finally putting an end to the whole tortuous odyssey.

The same press which had been so consumed by the case circa 1988 barely noticed its later developments. The narrative of Microsoft’s utter dominance and Apple’s weakness had become so prevalent by the early 1990s that it had become difficult to imagine any outcome other than a Microsoft victory. Yet the case’s anticlimactic ending obscured how dangerous it had once been, not only for Microsoft but for the software industry as a whole. Whatever one thinks in general of the products and business practices of the opposing sides, a victory for Apple would have been a terrible result for the personal-computer industry. The court got this one right in striking all of Apple’s claims down so thoroughly — something that can’t always be said about collisions between technology and the law. Bill Gates could walk away knowing the long struggle had struck an important blow for an ongoing culture of innovation in the software industry. Indeed, like the victory of his hero Henry Ford over a group of automotive patent trolls eighty years before, his victory would benefit his whole industry along with his company — which isn’t to say, of course, that he would have fought the war purely for the sake of altruism.

John Sculley, for his part, was gone from Apple well before the misguided lawsuit he had fostered came to its final conclusion. He was ousted by his board of directors in 1993, after it became clear that Apple would post a loss of close to $200 million for the year. Yet his departure brought no relief to the problems of dwindling market share, dwindling focus, and, most worrisome of all, a dwindling sense of identity. Apple languished, embittered about the ideas Microsoft had “stolen” from them, while Windows conquered the world. One could certainly argue that they deserved a better fate on the basis of a Macintosh GUI that still felt far slicker and more intuitive than Microsoft’s, but the reality was that their own poor decisions, just as much as Microsoft’s ruthlessness, had led them to this sorry place. The mid-1990s saw them mired in the greatest crisis of confidence of their history, licensing the precious Macintosh technology to clone makers and seriously considering breaking themselves up into two companies to appease their angriest shareholder contingents. For several years to come, there would be a real question of whether any part of the company would survive to see the new millennium. Gone were the Jobsian dreams of changing the world through better computing; Apple was reduced to living on Microsoft’s scraps. Microsoft had won in the marketplace as thoroughly as they had in court.

But the full story of Apple’s 1990s travails is one to take up at another time. Now, we should turn to IBM, to see how they coped after the MS-DOS-based Windows, rather than the OS/2-based Presentation Manager, made the world safe for the GUI.

Throughout 1990, that year of wall-to-wall hype over Windows 3.0, Microsoft persisted in dampening expectations for OS/2 in a way that struck IBM as deliberate. The agreement that MS-DOS and Windows were for low-end computers, OS/2 and the Presentation Manager for high-end ones, seemed to have been forgotten by Microsoft as soon as Bill Gates and Steve Ballmer left the Fall 1989 Comdex at which it had been announced. Gates now said that it could take OS/2 another three or four years to inherit the throne from MS-DOS, and by that time it would probably be running Windows rather than Presentation Manager anyway. Ballmer said that OS/2 was really meant to compete with high-end client/server operating systems like Unix, not with desktop operating systems like MS-DOS. They both said that “there will be a DOS 5, 6, and 7, and a Windows 4 and 5.” Meanwhile IBM was predictably incensed by Windows 3.0’s use of protected mode and the associated shattering of the 640 K barrier; that sort of thing was supposed to have been the purview of the more advanced OS/2.

Back in late 1988, Microsoft had hired a system-software architect from DEC named David Cutler to oversee the development of OS/2 2.0. No shrinking violet, he promptly threw out virtually all of the existing OS/2 code, which he pronounced a bloated mess, and started over from scratch on an operating system that would fulfill Microsoft’s original vision for OS/2, being targeted at machines with an 80386 or better processor. The scope and ambition of this project, along with the fact that Microsoft wished to keep it entirely in-house, had turned into yet one more source of tension between the two companies; it could be years still before Cutler’s OS/2 2.0 was ready. There remained little semblance of any coordinated strategy between the two companies, in public or in private.

And yet, in September of 1990, IBM and Microsoft announced a new roadmap for OS/2’s future. The two companies together would finish up one more version of the first-generation OS/2 — OS/2 1.3, which was scheduled to ship the following month — and that would be the end of that lineage. Then IBM would develop an OS/2 2.0 alone — a project they hoped to have done in a year or so — while Cutler’s team at Microsoft continued with the complete rewrite that was now to be marketed as OS/2 3.0.

The announcement, whose substance amounted to a tacit acknowledgement that the two companies simply couldn’t work together anymore on the same project, caused heated commentary in the press. It seemed a convoluted way to evolve an operating system at best, and it was happening at the same time that Microsoft seemed to be charging ahead — and with massive commercial success at that — on MS-DOS and Windows as the long-term face of personal computing in the 1990s. InfoWorld wrote of a “deepening rift” between Microsoft and IBM, characterizing the latest agreement as IBM “seizing control of OS/2’s future.” “Although in effect IBM and Microsoft will say they won’t divorce ‘for the sake of the children,'” said an inside source to the magazine, “in fact they are already separated, and seeking new relationships.” Microsoft pushed back against the “divorce” meme only in the most tepid fashion. “You may not understand our marriage,” said Steve Ballmer, “but we’re not getting divorced.” (One might note that when a couple have to start telling friends that they aren’t getting a divorce, it usually isn’t a good sign about the state of their relationship…)

Charles Petzold, writing in PC Magazine, summed up the situation created by all the mixed messaging: “The key words in operating systems are confusion, uncertainty, anxiety, and doubt. Unfortunately, the two guiding lights of this industry — IBM and Microsoft — are part of the problem rather than part of the solution.” If anything, this view of IBM as an ongoing “guiding light” was rather charitable. OS/2 was drowning in the Windows hype. “The success of Windows 3.0 has already caused OS/2 acceptance to go from dismal to cataclysmic,” wrote InfoWorld. “Analysts have now pushed back their estimates of when OS/2 will gain broad popularity to late this decade, with some predicting that the so-called next-generation operating system is all but dead.”

The final divorce of Microsoft from IBM came soon after to give the lie to all of the denials. In July of 1991, Microsoft announced that the erstwhile OS/2 3.0 was to become its own operating system, separate from both OS/2 and MS-DOS, called Windows NT. With this news, which barely made an impression in the press — it took up less than one quarter of page 87 of that week’s InfoWorld — a decade of cooperation came to an end. From now on, Microsoft and IBM would exist strictly as competitors in a marketplace where Microsoft enjoyed all the advantages. In the final divorce settlement, IBM gave up all rights to the upcoming Windows NT and agreed to pay a small royalty on all future sales of OS/2 (whatever those might amount to), while Microsoft paid a lump sum of around $30 million to be free and clear of their last obligations to the computing giant that had made them what they now were. They greeted this watershed moment with no sentimentality whatever. In a memo that leaked to the press, Bill Gates instead rejoiced that Microsoft was finally free of IBM’s “poor code, poor design, and other overhead.”

Even as the unlikely partnership’s decade of dominance was passing away, Microsoft’s decade of sole dominion was just beginning. The IBM PC and its clones had become the Wintel standard, and would require no further input from Big Blue, thank you very much. IBM’s share of the standard’s sales was already down to 17 percent, and would just keep on falling from there. “Microsoft is now driving the industry, not IBM,” wrote the newsletter Software Publishing by way of stating the obvious.

Which isn’t to say that IBM was going away. While Microsoft was celebrating their emancipation, IBM continued plodding forward with OS/2 2.0, which, like the aborted version 3.0 that was now to be known as Windows NT, ran only on an 80386 or better. They made a big deal of the work-in-progress at the Fall 1991 Comdex without managing to change the narrative around it one bit. The total bill for OS/2 was approaching an astonishing $1 billion, and they had very little to show for it. One Wall Street analyst pronounced OS/2 “the greatest disaster in IBM’s history. The reverberations will be felt throughout the decade.”

At the end of that year, IBM had to report — incredibly, for the very first time in their history — an annual loss. And it was no trivial loss either. The deficit was $2.8 billion, on revenues that had fallen 6.1 percent from the year before. The following year would be even worse, to the tune of a $5 billion loss. No company in the history of the world had ever lost this much money this quickly; by the last quarter of 1993, IBM would be losing $45 million every day. Microcomputers were continuing to replace the big mainframes and minicomputers that had once been the heart of IBM’s business. Now, though, fewer and fewer of those replacement machines were IBM personal computers; whole segments of their business were simply evaporating. The vague distrust IBM had evinced toward Microsoft for most of the 1980s now seemed amply justified, as all of their worst nightmares came true. IBM seemed old, bloated, and, worst of all, irrelevant next to the fresh-faced young Microsoft.

OS/2 2.0 started reaching consumers in May of 1992. It was a surprisingly impressive piece of work; perhaps the relationship with Microsoft had been as frustrating for IBM’s programmers as it had been for their counterparts at Microsoft. Certainly OS/2 2.0 was a far more sophisticated environment than Windows 3.0. Being designed to run only on 32-bit microprocessors like the 80386 and 80486, it utilized them to their maximum potential, which was much more than one could say for Windows, while also being much more stable than Microsoft’s notoriously crash-prone environment. In addition to native OS/2 software, it could run multiple MS-DOS applications at the same time with complete compatibility, and, in a new wrinkle added to the mix by IBM, could now run many Windows applications as well. IBM called it “a better DOS than DOS and a better Windows than Windows,” a claim which carried a considerable degree of truth. They pointedly cut its suggested list price of $140 to just $50 for Windows users looking to “upgrade.”

A Quick Tour of OS/2 2.0


Shipping on more than twenty 3.5-inch diskettes, OS/2 2.0 was by far the most elaborate operating system yet made for its family of personal computers. When we boot it up for the first time, we’re given a lengthy interactive tutorial of a sort that was seldom seen in software of 1992 vintage.

The notion of a “Presentation Manager” GUI that’s separate from the core OS/2 operating system has been dropped; OS/2 is now simply OS/2, with a GUI as the standard, built-in interface. From the opening tutorial to the look of its desktop, the whole package reminds one of nothing so much as the much later Windows 95. We have a full-fledged, functioning desktop workspace here, with icons representing folders and disks, and a “shredder” to replace the usual trash can.

After shipping earlier versions of OS/2 with no extra tools or applets whatsoever, IBM got wise this time around and included plenty of stuff to play with, like this neat little music editor.

Some aspects of the interface are a little strange. Dragging with the mouse is accomplished using the right button rather than the left — a fine example of OS/2’s superficial similarity and granular dissimilarity to Windows, which so many users who had to move back and forth between the environments found so frustrating.

Of course, MS-DOS is still around if you need it. Unlike in OS/2 1.x, here you can have as many MS-DOS windows and applications open as you like.

But, despite its many merits, OS/2 2.0 was a lost cause from the start, at least if one’s standard for success was Windows. Windows 3.1 rolled out of Microsoft at almost the same instant, and no amount of comparisons in techie magazines pointing out the alternative operating system’s superiority could have any impact on a mass market that was now thoroughly conditioned to accept Windows as the standard. Giant IBM’s operating system had become, as the New York Times put it, “an unlikely underdog.”

In truth, the contest was so lopsided by this point as to be laughable. Microsoft, who had long-established relationships with the erstwhile clone makers — now known as makers of hardware conforming to the Wintel standard — understood early, as IBM did only much too late, that the best and perhaps only way to get your system software widely accepted was to sell it pre-installed on the computers that ran it. Thus, by the time OS/2 2.0 shipped, Windows already came pre-installed on nine out of ten personal computers on the market, thanks to a smart and well-funded “original equipment manufacturer” sales team that was overseen personally by Steve Ballmer. And thus, simply by buying a new computer, one automatically became a Windows user. Running OS/2, on the other hand, required that the purchaser of one of these machines decide to go out and buy an alternative to the perfectly good Microsoft software already on her hard drive, and then go through all the trouble of installing and configuring it. Very few people had the requisite combination of motivation and technical skill for an exercise like that.

As a final indignity, IBM themselves had to bow to customer demand and offer MS-DOS and Windows as an optional alternative to OS/2 on their own machines. People wanted the system software that they used at the office, that their friends had, that could run all of the products on the shelves of their local computer store with 100-percent fidelity (with the exception of that oddball Mac stuff off in the corner, of course). Only the gearheads were going to buy OS/2 because it was a 32-bit instead of a 16-bit operating system or because it offered preemptive instead of cooperative multitasking, and they were a tiny slice of an exploding mass market in personal computing.

That said, OS/2 did have a better fate than many another alternative operating system during this period of Windows, Windows everywhere. It stayed around for years even in the face of that juggernaut, going through two more major revisions and many minor ones, the very last coming as late as December of 2001. It remained always a well-respected operating system that just couldn’t break through Microsoft’s choke hold on mainstream computing, having to content itself with certain niches — powering automatic teller machines was a big one for a long time — where its stability and robustness served it well.

So, IBM, and Apple as well, had indeed become the outsiders of personal computing. They would retain that dubious status for the balance of the decade of the 1990s, offering alternatives to the monoculture of Windows computing that appealed only to the tech-obsessed, the idealistic, or the just plain contrarian. Even as much of what I’ve related in this article was taking place, they were being forced into one another’s arms for the sake of sheer survival. But the story of that second unlikely IBM partnership — an awkward marriage of two corporate cultures even more dissimilar than those of Microsoft and IBM — must, like so much else, be told at another time. All that’s left to tell in this series is the story of how Windows, with the last of its great rivals bested, finished the job of conquering the world.

(Sources: the books The Making of Microsoft: How Bill Gates and His Team Created the World’s Most Successful Software Company by Daniel Ichbiah and Susan L. Knepper, Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America by Stephen Manes and Paul Andrews, Computer Wars: The Fall of IBM and the Future of Global Technology by Charles H. Ferguson and Charles R. Morris, and Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer; PC Week of September 24 1990 and January 15 1991; InfoWorld of September 17 1990, May 29 1991, July 29 1991, October 28 1991, and September 6 1993; New York Times of December 29 1989, March 24 1990, March 7 1991, May 24 1991, January 18 1992, August 8 1992, January 20 1993, April 19 1993, and June 2 1993; Seattle Times of June 2 1993. Finally, I owe a lot to Nathan Lineback for the histories, insights, comparisons, and images found at his wonderful online “GUI Gallery.”)

Footnotes

1. This has changed since this article was written; see Ian Crossfield’s comment below.


Doing Windows, Part 6: Look and Feel

From left, Dan Fylstra of VisiCorp, Bill Gates of Microsoft, and Gary Kildall of Digital Research in 1984. As usual, Gates looks rumpled, high-strung, and vaguely tortured, while Kildall looks polished, relaxed, and self-assured. (Which of these men would you rather chat with at a party?) Pictures like these perhaps reveal one of the key reasons that Gates consistently won against more naturally charismatic characters like Kildall: he personally needed to win in ways that they did not.

In the interest of clarity and concision, I’ve restricted this series of articles about non-Apple GUI environments to the efforts of Microsoft and IBM, making an exception to that rule only for VisiCorp’s Visi On, the very first product of its type. But, as I have managed to acknowledge in passing, those GUIs hardly constituted the sum total of the computer industry’s efforts in this direction. Among the more impressive and prominent of what we might label the alternative MS-DOS GUIs was a product from none other than Gary Kildall and Digital Research — yes, the very folks whom Bill Gates once so slyly fleeced out of a contract to provide the operating system for the first IBM PC.

To his immense credit, Kildall didn’t let the loss of that once-in-a-lifetime opportunity get him down for very long. Digital Research accepted the new MS-DOS-dominated order with remarkable alacrity, and set about making the best of things by publishing system software, such as the multitasking Concurrent DOS, which tried to do the delicate dance of improving on MS-DOS while maintaining compatibility. In the same spirit, they made a GUI of their own, called the “Graphics Environment Manager” — GEM.

After futzing around with various approaches, the GEM team found their muse on the day in early 1984 when team-member Darrell Miller took Apple’s new Macintosh home to show his wife: “Her eyes got big and round, and she hates computers. If the Macintosh gets that kind of reaction out of her, this is powerful.” Miller is blunt about what happened next: “We copied it exactly.” When they brought their MacOS clone to the Fall 1984 Comdex, Steve Jobs expressed nothing but approbation. “You did a great job!” he said. No one from Apple seemed the slightest bit concerned at this stage about the resemblance to the Macintosh, and GEM hit store shelves the following spring as by far the most elegant and usable MS-DOS GUI yet.

A few months later, though, Apple started singing a very different tune. In the summer of 1985, they sent a legal threat to Digital Research which included a detailed list of all the ways that they believed GEM infringed on their MacOS copyrights. Having neither the stomach nor the cash for an extended court battle and fearing a preliminary injunction which might force them to withdraw GEM from the market entirely, Digital Research caved without a fight. They signed an agreement to replace the current version of GEM with a new one by November 15, doing away with such distinctive and allegedly copyright-protected Macintosh attributes as “the trash-can icon, the disk icons, and the close-window button in the upper-left-hand corner of a window.” They also agreed to an “undisclosed monetary settlement,” and to “provide programming services to Apple at a reduced rate.”

Any chance GEM might have had to break through the crowded field of MS-DOS GUIs was undone by these events. Most of the third-party developers Digital Research so desperately needed were unnerved by the episode, abandoning any plans they might have hatched to make native GEM applications. And so GEM, despite being vastly more usable than the contemporaneous Microsoft Windows even in its somewhat bowdlerized post-agreement form, would go on to become just another also-ran in the GUI race. (Reworked to run under a 68000 architecture, GEM would enjoy some degree of sustained success in another realm: not as an MS-DOS-hosted GUI but as the GUI hosted in the Atari ST’s ROM. In this form, it would survive well into the 1990s.)

For the industry at large, the GEM smackdown was most significant as a sign of changing power structures inside Apple — changes which carried with them a new determination that others shouldn’t be allowed to rip off all of the Mac’s innovations. The former Pepsi marketing manager John Sculley was in the ascendant at Apple by the summer of 1985, Steve Jobs already being eased out the door. Sculley had been taught by the Cola Wars that a product’s secret formula was everything, and had to be protected at all costs. And the Macintosh’s secret formula was its beautiful interface; without it, it was just an overpriced chunk of workmanlike hardware — a bad joke when set next to a better, cheaper Motorola 68000-based computer like the new Commodore Amiga. The complaint against Digital Research was a warning shot to an industry that Sculley believed had gotten far too casual about throwing around phrases like “Mac-like.” “Apple is going after everybody,” warned one fearful software executive to the press. The relationship between Microsoft and Apple in particular was about to get a whole lot more complicated.

Said relationship had been a generally good one during the years when Steve Jobs was calling many of Apple’s shots. Jobs and Bill Gates, dramatically divergent in countless ways but equally ambitious, shared a certain esprit de corps born of having been a part of the microcomputer industry since before there was a microcomputer industry. Jobs genuinely appreciated his counterpart’s refusal to frame business computing as a zero-sum game between the Macintosh and the MS-DOS standard, even when provoked by agitprop like Apple’s famous “1984” Super Bowl advertisement. Instead Gates, contrary to his established popular reputation as the ultimate zero-sum business warrior, supported Apple’s efforts as well as IBM’s with real enthusiasm: signing up to produce Macintosh software two full years before the finished Mac was released, standing at Jobs’s side when Apple made major announcements, coming to trade shows conspicuously sporting a Macintosh tee-shirt. All indications are that the two truly liked and respected one another. For all that Apple and Microsoft through much of these two men’s long careers would be cast as the yin and yang of personal computing — two religions engaged in the most righteous of holy wars — they would have surprisingly few negative words to say about one another personally down through the years.

But when Steve Jobs decided or was forced to submit his resignation letter to Apple on September 17, 1985, trouble for Microsoft was bound to follow. John Sculley, the man now charged with cleaning up the mess Jobs had supposedly made of the Macintosh, enjoyed nothing like the same camaraderie with Bill Gates. He and his management team were openly suspicious of Microsoft, whose Windows was already circulating widely in beta form. Gates and others at Microsoft had gone on the record repeatedly saying they intended for Windows and the Macintosh to be sufficiently similar that they and other software developers would be able to port applications in short order between the two. Few prospects could have sounded less appealing to Sculley. Apple, whose products then as now enjoyed the highest profit margins in the industry thanks to their allure as computing’s hippest luxury brand, could see their whole business model undone by the appearance of cheap commodity clones that had been transformed by the addition of Windows into Mac-alikes. Of course, one look at Windows as it actually existed in 1985 could have disabused Sculley of the notion that it was likely to win any converts among people who had so much as glanced at MacOS. Still, he wasn’t happy about the idea of the Macintosh losing its status, now or in the future, as the only GUI environment that could serve as a true, comprehensive solution to all of one’s computing needs. So, within weeks of Jobs’s departure, feeling his oats after having so thoroughly cowed Digital Research, he threatened to sue Microsoft as well for copying the “look and feel” of the Macintosh in Windows.

He really ought to have thought things through a bit more before doing so. Threatening Bill Gates was always a dangerous game to play, and it was sheer folly when Gates had the upper hand, as he largely did now. Apple was at their lowest ebb of the 1980s when they tried to tell Microsoft that Windows would have to be cancelled or radically redesigned to excise any and all similarities to the Macintosh. Sales of the Mac had fallen to some 20,000 units per month, about one-fifth of Apple’s pre-launch projections for this point. The stream of early adopters with sufficient disposable income to afford the pricey gadget had ebbed away, and other potential buyers had started asking what you could really do with a Macintosh that justified paying two or three times as much for it as for an equivalent MS-DOS-based computer. Aldus PageMaker, the first desktop-publishing package for the Mac, had been released the previous summer, and would eventually go down in history as the product that, when combined with the Apple LaserWriter printer, saved the platform by providing a usage scenario that ugly old MS-DOS clearly, obviously couldn’t duplicate. But the desktop-publishing revolution would take time to show its full import. In the meantime, Apple was hard-pressed, and needed Microsoft — one of the few major publishers of business software actively supporting the Mac — far too badly to go around issuing threats to them.

Gates responded to Sculley’s threat with several of his own. If Sculley followed through with a lawsuit, Gates said, he’d stop all work at Microsoft on applications for the Macintosh and withdraw those that were already on store shelves, treating business computing henceforward as exactly the zero-sum game which he had never believed it to be in the past. This was a particularly potent threat in light of Microsoft’s new Excel spreadsheet, which had just been released to rave reviews and already looked likely to join PageMaker as the leading light among the second generation of Mac applications. Given the machine’s marketplace travails, Apple was in no position to toss aside a sales driver like that one, the first piece of everyday Mac business software that was not just as good as but in many ways quite clearly better than equivalent offerings for MS-DOS. Yet Gates wouldn’t stop there. He would also, he said, refuse to renew Apple’s license to use Microsoft’s BASIC on their Apple II line of computers. This was a serious threat indeed, given that the aged Apple II line was the only thing keeping Apple as a whole afloat as the newer, sexier Macintosh foundered. Duly chastised, Apple backed down quickly — whereupon Gates, smelling blood in the water, pressed his advantage relentlessly, determined to see what else he could get out of finishing the fight Sculley had so foolishly begun.

One ongoing source of frustration between the two companies, dating back well into the days of Steve Jobs’s power and glory, was the version of BASIC for the Mac which Microsoft had made available for purchase on the day the machine first shipped. In the eyes of Apple and most of their customers, the mere fact of its existence on a platform that wasn’t replete with accessible programming environments was its only virtue. In practice, it didn’t work all that differently from Microsoft’s Apple II BASIC, offering almost no access to the very things which made the Macintosh the Macintosh, like menus, windows, and dialogs. A second release a year later had improved matters somewhat, but nowhere near enough in most people’s view. So, Apple had started work on a BASIC of their own, to be called simply MacBASIC, to supersede Microsoft’s. Microsoft BASIC for the Macintosh was hardly a major pillar of the company’s finances, but Bill Gates was nevertheless bothered inordinately by the prospect of it being cast aside. “Essentially, since Microsoft started their company with BASIC, they felt proprietary towards it,” speculates Andy Hertzfeld, one of the most important of the Macintosh software engineers. “They felt threatened by Apple’s BASIC, which was a considerably better implementation than theirs.” Gates said that Apple would have to kill their own version of BASIC and — just to add salt to the wound — sign over the name “MacBASIC” to Microsoft if they wished to retain the latter’s services as a Mac application developer and retain Microsoft BASIC on the Apple II.

And that wasn’t even the worst form taken by Gates’s escalation. Apple would also have to sign what amounted to a surrender document, granting Microsoft the right to create “derivative works of the visual displays generated by Apple’s Lisa and Macintosh graphic-user-interface programs.” The specific “derivative works” covered by the agreement were the user interfaces already found in Microsoft Windows for MS-DOS and five Microsoft applications for the Macintosh, including Word and Excel. The agreement provided Microsoft with nothing less than a “non-exclusive, worldwide, royalty-free, perpetual, non-transferable license to use those derivative works in present and future software programs, and to license them to and through third parties for use in their software programs.” In return, Microsoft would promise only to support Word and Excel on the Mac until October 1, 1986 — something they would certainly have done anyway. Gates was making another of those deviously brilliant tactical moves that were already establishing his reputation as the computer industry’s most infamous villain. Rather than denying that a “visual display” could fall under the domain of copyright, as many might have been tempted to do, he would rather affirm the possibility while getting Apple to grant Microsoft an explicit exception to being bound by it. Thus Apple — or, for that matter, Microsoft — could continue to sue MacOS’s — and potentially Windows’s — competitors out of existence while Windows trundled on unmolested.

Sculley called together his management team to discuss what to do about this Apple threat against Microsoft that had suddenly boomeranged into a Microsoft threat against Apple. Most at the meeting insisted that Gates had to be bluffing, that he would never cut off several extant revenue streams just to spite Apple and support this long-overdue Windows product of his which had been an industry laughingstock for so long. But Sculley wasn’t sure; he kept coming back to the fact that Microsoft could undoubtedly survive without Apple, but Apple might not be able to survive without Microsoft — at least not right now, given the Mac’s current travails. “I’m not ready to bloody the company,” he said, and signed the surrender document two days after Windows 1.01 first appeared in its boxed form at the Fall 1985 Comdex show’s Microsoft Roast. His tone toward Gates now verged on pleading: “What I’m really asking for, Bill, is a good relationship. I’m glad to give you the rights to this stuff.”

After the full scale of what John Sculley had given away to Bill Gates became clear, Apple fans started drawing pointed comparisons between Sculley and Neville Chamberlain. As it happened, Sculley’s version of “peace for our time” would last scarcely longer than Chamberlain’s. And as for Gates… well, plenty of Apple fans would indeed soon be calling him the Adolf Hitler of the computer industry in the midst of plenty of other overheated rhetoric.

Bill Gates wrote a jubilant email to eleven colleagues at Microsoft’s partner companies, saying that he had “received a release from Apple for any possible copyright, trade-secret, or patent issue relating to our products, including Windows.” The people at Apple were less jubilant. “Everyone was somewhat disgusted over [the agreement],” remembers Donn Denman, the chief programmer of Apple’s much superior but shitcanned MacBASIC. Sculley could only say to him and his colleagues that “it was the right decision for the company. It was a business decision.” They didn’t find him very convincing. The bad feelings engendered by the agreement would never entirely go away, and the relationship between Apple and Microsoft would never be quite the same again — not even when Excel became one of the prime drivers of something of a Macintosh Renaissance in the following year.

We jump forward now to March 17, 1988, by which time the industry had changed considerably. Microsoft was still entangled with IBM in the development of OS/2 and its Presentation Manager, but was also continuing to push Windows, which had come out in a substantially revised version 2 some six months earlier. The Macintosh, meanwhile, had carved out a reasonable niche for itself as a tool for publishers and creative professionals of various stripes, even as the larger world of business-focused personal computing continued to run on MS-DOS.

Sitting in his office that day, Bill Gates agreed to take a call from a prominent technology journalist, who asked him if he had a comment to make about the new lawsuit from Apple against Microsoft. “Lawsuit? What lawsuit?” Gates asked. He had just met with Sculley the day before to discuss Microsoft’s latest Mac applications. “He never mentioned it to me. Not one word,” said Gates to the reporter on the other end of the line.

Sculley, it seemed, had decided not to risk losing his nerve again. Apple had gone straight to filing their lawsuit in court, without giving Microsoft so much as a warning, much less a chance to negotiate a remedy. (Apple sued Hewlett-Packard at the same time, for an application called NewWave which ran on top of Windows and provided many of the Mac-like features, such as icons representing programs and disks and a desktop workspace, which Windows 2 alone still lacked. But that lawsuit would always remain a sideshow in comparison to the main event, to whose fate its own must inevitably be tied. So, in the interest of that aforementioned clarity and concision, we won’t concern ourselves with it here.) It appeared that the latest version of Microsoft’s GUI environment for MS-DOS, which with its free-dragging and overlapping windows hewed much closer to the Macintosh than its predecessor, had both scared and enraged Sculley to such an extent that he had judged this declaration of war to be his only option. “Windows 2 is an unconscionable ripoff of MacOS,” claimed Apple. They demanded $50,000 per infringement per unit of Windows sold — adding up to a downright laughable total of $4.5 billion by their current best estimate — and the “impoundment and destruction” of all extant or future copies of Windows. Microsoft replied that Apple had signed over the rights to the Mac’s “visual displays” for use in Windows in 1985, and, even if they hadn’t, such things weren’t really copyrightable anyway.

So, who had the right of this thing? As one might expect, the answer to that question is far more nuanced than the arguments which either side would present in court. Writing years after the lawsuit had passed into history but repeating the arguments he had once made in court, Tandy Trower, the Windows project leader at Microsoft from 1985 to 1988, stated that “the allegation clearly had no merit as I had never intended to copy the Macintosh interface, was never given any directive to do that, and never directed my team to do that. The similarities between the two products were largely due to the fact that both Windows and Macintosh had common ancestors, that being many of the earlier windowing systems, such as those like Alto and Star that were created at Xerox PARC.” This is, to put it bluntly, nonsense. To deny the massive influence of the Macintosh on Windows is well-nigh absurd — although, I should be careful to say, I have no reason to believe that Trower makes his absurd argument out of anything but ignorance here. By the time he arrived on the Windows team, Apple’s implementation of the GUI had already been so thoroughly internalized by the industry in general that the huge strides it had made over the Xerox PARC model were being forgotten, and the profoundly incorrect and unfair meme that Apple had simply copied Xerox’s work and called it a day was already taking hold.

The people at Xerox PARC had indeed originated the idea of the GUI, but, as playing with a Xerox Alto emulator will quickly reveal, hadn’t been able to take it anywhere close to the Macintosh’s place of elegant, intuitive usability. By the time the Xerox GUI made its one appearance as a commercial product, in the form of the Xerox Star office system, it had actually regressed in at least one way even as it progressed in many others: overlapping windows, which had been possible in Xerox PARC’s Smalltalk environment, were not allowed on the Star. Tellingly, the aspect of Windows 1 which attracted the most derision back in the day, and which still makes it appear so hapless today, is a similar rigid system of tiled windows. (The presence of this retrograde-seeming element was largely thanks to Scott MacGregor, who arrived at Microsoft to guide the Windows project after having been one of the major architects of the Star.) Meanwhile, as I also noted in my little tour of Windows 1 in a previous article, many of those aspects of it which do manage to feel natural and intuitive today — such as the drop-down menus — are those that work more like the Macintosh than anything developed at Xerox PARC. In light of this reality, Microsoft’s GUI would only hew closer to the Mac model as time went on, for the incontrovertible reason that the Mac model was just better for getting real stuff done in the real world.

And there are plenty of other disconcerting points of similarity between early versions of MacOS and early versions of Windows. Right from the beginning, Windows 1 shipped with a suite of applets — a calculator, a “control panel” for system settings, a text editor that went by the name of “notepad,” etc. — that were strikingly similar to those included in MacOS. Further, if other members of the Windows team itself are to be believed, Microsoft’s Neil Konzen, a programmer intimately familiar with the Macintosh, duplicated some of MacOS’s internal structures so closely as to introduce some of the same bugs. In short, to believe that the Macintosh wasn’t the most important influence on the Windows user interface by far, given not only the similarities in the finished product but the knowledge that Microsoft had been working daily with the evolving Macintosh since January of 1982, is to actively deny reality out of either ignorance or some ulterior motive.

Which isn’t to say that Microsoft’s designers had no ideas of their own. In fact, some of those ideas are still in place in current versions of Windows. To take perhaps the most immediately obvious example, Windows then and now places its drop-down menus at the top of the windows themselves, while the Macintosh has a menu bar at the top of the screen which changes to reflect the currently selected window. (Both Microsoft and Apple have collected reams of data which they claim prove that their approach is the best one. I do suspect, however, that the original impetus can be found in the fact that MacOS was originally a single-tasking operating system, meaning that only one menu bar would need to be available at any one time. Windows, on the other hand, was designed as a multitasking environment from the start.) And Microsoft’s embrace of the two-button mouse, contrasted with Apple’s stubborn loyalty to the one-button version of same, has sparked constant debate for decades. Still, differing details like these should be seen as exactly that in light of all the larger-scale similarities.

And yet just acknowledging that Windows was, shall we say, strongly influenced by MacOS hardly got to the bottom of the 1988 case. There was still the matter of that November 1985 agreement, which Microsoft was now waving in the face of anyone in the legal or journalistic professions who would look at it. The bone of contention between the two companies here was whether the “visual displays” of Windows 2 as well as Windows 1 were covered by the agreement. Microsoft naturally contended that they were; Apple contended that the Windows 2 interface had changed so much in comparison to its predecessor — coming to resemble the Macintosh even more in the process — that it could no longer be considered one of the specific “derivative works” to which Apple had granted Microsoft a license.

We’ll return to the court’s view of this question shortly. For now, though, let’s give Apple the benefit of the doubt as we continue to explore the full ramifications of their charges against Microsoft. The fact was that if one accepted Apple’s contention that Windows 2 wasn’t covered by the agreement, the questions surrounding the case grew more rather than less momentous. Could and should one be able to copyright the “look and feel” of a user interface, as opposed to the actual code used to create it? In pressing their claim, Apple was relying on an amorphous, under-explicated area of copyright law known as “visual copyright.”

In terms of computer software, the question of the bounds of visual copyright had been most thoroughly explored in the context of videogames. Back in 1980, Midway, a major producer of standup-arcade games, had sued a much smaller company called Dirkschneider for producing a clone of their popular game Galaxian. The judge in that case ruled in favor of Midway, formulating a new legal standard called the “Ten-foot Rule”: “If a reasonable person could not, at ten feet, tell the difference between two competitive products, then there was cause to believe an infringement was occurring.” Atari, the biggest videogame producer of all, then proceeded to use this precedent to pressure dozens of companies into withdrawing their clones of Atari games — in arcades, on game consoles, and on computers — from the market.

Somewhat later, in 1985, Brøderbund Software sued Kyocera for bundling with their printers an application called Printmaster, a thinly veiled clone of Brøderbund’s own hugely popular Print Shop package for making signs, greeting cards, and banners. They won their case the following year, with Judge William H. Orrick stating that Brøderbund’s copyright did indeed cover “the overall appearance, structure, and sequence” of screens in their software, and that Kyocera had thus infringed on same. Brøderbund’s Gary Carlston called the ruling “historic”: “If we don’t have copyright protection for our products, then it is going to be significantly more difficult to maintain a competitive advantage.” Encouraged by this ruling, in 1987 a maker of telecommunications software called Digital Communications Associates sued a company called Softklone Corporation — their name certainly didn’t help their cause — for copying the status display of their terminal software, and won their case as well. The Ten-Foot Rule, it seemed, could be successfully applied to software other than games. Both of these cases were cited by Apple’s lawyers in their own suit against Microsoft.

Brøderbund’s Print Shop side-by-side with Kyocera’s Printmaster.

Yet the Ten-Foot Rule, at least when applied to general-purpose software rather than games, struck many as deeply problematic. One of the most important advantages of a GUI was the way it made diverse types of software from diverse developers work and look the same, thereby keeping the user from having to relearn how to do the same basic tasks over and over again. What sort of chaos would follow if people started suing each other willy-nilly over this much-needed uniformity? And what did the Ten-Foot Rule mean for the many GUI environments, on MS-DOS and other platforms, that looked so similar to one another and to MacOS? That, of course, was the real crux of the matter for Microsoft and Apple as they faced one another in court.

The debate over the Ten-Foot Rule and its potential ramifications wasn’t actually a new one, having already been taken up in public by the software industry before Apple filed their lawsuit. Fully thirteen months before that momentous day, Larry Tesler, an Apple executive, clashed heatedly with Bill Gates over this very issue at a technology conference. Tesler insisted that there was no problem inherent in applying the Ten-Foot Rule to operating systems and operating environments: “When someone comes up with a very good and popular look and feel, as we’ve done with the Macintosh, then they can make that available by licensing [it] to other people.”

But Gates was having none of this:

There’s no control of look and feel. I don’t know anybody who has asserted that things like drop-down menus and dialog boxes and just those general form-type aspects are subject to this look-and-feel stuff. Certainly it’s our view that the consistency of the user interface has become a very spreading thing, and that it’s open, generic technology. All of these approaches — how you click on [menu] bars, and certainly all those user-interface techniques and windows — there’s absolutely no restriction in any way on how people use those.

He thus ironically argued against the very premise of the 1985 agreement between Apple and Microsoft — that Apple had created a “visual display” subject to copyright protection, to which they were now granting Microsoft a license for certain products. But then, Gates seldom let deeply-held philosophical beliefs interfere with his pursuit of short-term advantage. In this latest debate as well, Gates’s arguments were undoubtedly self-serving, but they were no less valid for being so. The danger he could point to if this sort of thing should spread was that of every innovative new application seeking copyright protection not just for its code but for the very ideas that made it up, because form — i.e., look and feel — ideally followed function in software engineering. What would have happened if VisiCorp had been able to copyright the look and feel of the first spreadsheet? (VisiCalc and Lotus 1-2-3 looked pretty much identical from ten feet.) If WordStar had been able to copyright the look and feel of the word processor? If, to choose a truly absurd example, the first individual to devise a command-line interface back in the mists of time had been able to copyright that? It wasn’t at all clear where the lines could be drawn once the law started down this slippery slope. If Apple owned the set of ideas and approaches that everyone now thought of as the GUI in general, where did that leave the rest of the industry?

For this reason, Apple’s lawsuit, when it came, was greeted with deep concern even by many of those who weren’t particularly friendly with Microsoft. “Although Apple has a right to protect the results of its development and marketing efforts,” said the respected Silicon Valley pundit Larry Magid, “it should not try to thwart the obvious direction of the industry.” “If Apple is trying to push this as far as they appear to be trying to push it,” said Dan Bricklin of VisiCalc fame, “this is a sad day for the software industry in America.” More surprisingly, MacOS architect Andy Hertzfeld said that “in general, it’s a horrible thing. Apple could really end up hurting itself.” Most surprisingly of all, even Steve Jobs, now running a new company called NeXT, found Apple’s arguments as dangerous as they were unconvincing: “When we were developing the Macintosh, we kept in mind a famous quote of Picasso: ‘Good artists copy, great artists steal.’ What do I think of the suit? I personally don’t understand it. Can I copyright gravity? No.”

Interestingly, the lawyers pressing the lawsuit on Apple’s behalf didn’t ask for a preliminary injunction that would have forced Microsoft to withdraw Windows from the market. Some legal watchers interpreted this fact as a sign that they themselves weren’t certain about the real merits of their case, and hoped to win it as much through bluster as finely-honed legal arguments. Ditto Apple’s request that the eventual trial be decided by a jury of ordinary people who might be prone to weigh the case based on everyday standards of “fairness,” rather than by a judge who would be well-versed in the niceties of the law and the full ramifications of a verdict against Microsoft.

At this point, and especially given those ramifications, one feels compelled to ask just why Apple chose at this juncture to embark on such a lengthy, expensive, and fraught enterprise as a lawsuit against the company that remained the most important single provider of serious business software for the Macintosh, a platform whose cup still wasn’t exactly running over with such things. By way of an answer, we should consider that John Sculley was as proud a man as most people who rise to his elevated status in business tend to be. The belief, widespread both inside and outside of Apple, that he had let Bill Gates bully, outsmart, and finally rob him blind back in 1985 had to rankle him badly. In addition, Apple in general had long nursed a grievance, unproductive but understandable, against all the outsiders who had copied the interface they had worked so long and hard to perfect; thus those threatened lawsuits against Digital Research and Microsoft all the way back in 1985. A wiser leader might have told his employees to take their competitors’ imperfect copying as proof of Apple’s superiority, might have exhorted them to look toward their next big innovation rather than litigate their innovations of the past. But, at least on March 17, 1988, John Sculley wasn’t that wiser leader. Thus this lawsuit, dangerous not just to Apple and Microsoft but to their entire industry.

Bill Gates, for his part, remained more accustomed to bullying than being bullied. It had been spelled out for him right there in the court filing that a loss to Apple would almost certainly mean the end of Windows, the operating environment which was quite possibly the key to Microsoft’s future. Even widespread fear of such an event, he realized, could be devastating to Windows’s — and thus to Microsoft’s — prospects. So, he struck back fiercely so as to leave no doubt where he stood. Microsoft filed a counter-suit in April of 1988, accusing Apple of breaking the 1985 agreement and of filing their own lawsuit in bad faith, in the hope of creating fear, uncertainty, and doubt around Windows and thus “wrongfully inhibiting” its commercial future. Adding weight to their argument that the original lawsuit was a form of business competition by other means was the fact that Apple was being oddly selective in choosing whom to sue over the alleged copyright violations. Asked why they weren’t going after other products just as similar to MacOS as Windows, such as IBM’s forthcoming OS/2 Presentation Manager, Apple refused to comment.

The first skirmishes took place in the press rather than a courtroom: Sculley accusing Gates of having tricked him into signing the 1985 agreement, Gates saying a contract was a contract, and what sort of a chief executive let himself be tricked anyway? The exchanges just kept getting uglier from there. The technology journalists, naturally, loved every minute of it, while the software industry was thrown into a tizzy, wondering what this would mean for Windows just as it finally seemed to be gaining some traction. Philippe Kahn, CEO of Borland, described the situation in colorful if non-politically-correct language: it was like “waking up and finding your partner might have AIDS.”

The court case marched forward much more slowly than the tabloid war of words. Gates stated in a sworn deposition that “from a user’s perspective, the visual displays which appear in Windows 2 are virtually identical to those which appear in Windows 1.” “This assertion,” Apple replied, “is contradicted by even the most casual observation of the two products.” On March 18, 1989, Judge William Schwarzer of the Federal District Court in San Francisco marked the one-year anniversary of the case by ruling against Microsoft on this issue, stating that only those attributes of Windows 2 which had also existed in Windows 1 were covered by the 1985 agreement. This meant most notably that the newer GUI’s system of overlapping windows stood outside the boundaries of that document, and thus that, as the judge put it, the 1985 agreement alone “was not a complete defense” for Microsoft. It did not, he ruled, give Microsoft the right “to develop future versions of Windows as it pleases. What Microsoft received was a license to use the visual displays in the named software products as they appeared to the user in November 1985. The displays [of Windows 1 and Windows 2] are fundamentally different.” Microsoft’s stock price promptly plummeted by 27 percent. It was an undeniable setback. “Microsoft’s major defense has been shot down,” crowed Apple’s attorneys.

“Major” was perhaps not the right choice of adjectives, but certainly Microsoft’s simplest possible form of defense had proved insufficient to bail them out. It seemed that total victory could be achieved now only by invalidating the whole notion of visual copyright which underlay both the 1985 agreement and Apple’s lawsuit based on its violation. That meant a long, tough slog at best. And with Windows 3 — the version that Microsoft was convinced would finally be the breakthrough version — getting closer and closer to release and looking more and more like the Macintosh all the while, the stakes were higher than ever.

The question of look and feel and visual copyright as applied to software had implications transcending even the fate of Windows or either company. If Apple’s suit succeeded, it would transform the software business overnight, making it extremely difficult to borrow or build on the ideas of others in the way that software had always done in the past. Bill Gates was an avid student of business history. As was his wont, he now looked back to compare his current plight with that of an earlier titan of industry. Back in 1903, just as the Ford Motor Company was getting off the ground, Henry Ford had been hit with a lawsuit from a group of inventors claiming to own a patent on the very concept of the automobile. He had battled them for years, vowing to fight on even after losing in open court in 1909: “There will be no let-up in the legal fight,” he declared on that dark day. At last, in 1911, he won the case on appeal — winning it not only for Ford Motor Company but for the future of the automobile industry as a field of open competition. His own legal war had similar stakes, Gates believed, and he and Microsoft intended to prosecute it in equally stalwart fashion — to win it not just for themselves but for the future of the software industry. This was necessary, he wrote in a memo, “to help set the boundaries of where copyrights should and should not be applied. We will prevail.”

(Sources: the books The Making of Microsoft: How Bill Gates and His Team Created the World’s Most Successful Software Company by Daniel Ichbiah and Susan L. Knepper, Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America by Stephen Manes and Paul Andrews, and Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer; Wall Street Journal of September 25 1987; Creative Computing of May 1985; InfoWorld of October 7 1985 and October 20 1986; MacWorld of October 1993; New York Times of March 18 1988 and March 18 1989.)

Footnotes

1 Reworked to run under a 68000 architecture, GEM would enjoy some degree of sustained success in another realm: not as an MS-DOS-hosted GUI but as the GUI hosted in the Atari ST’s ROM. In this form, it would survive well into the 1990s.
2 Apple sued Hewlett-Packard at the same time, for an application called NewWave which ran on top of Windows and provided many of the Mac-like features, such as icons representing programs and disks and a desktop workspace, which Windows 2 alone still lacked. But that lawsuit would always remain a sideshow in comparison to the main event to whose fate its own must inevitably be tied. So, in the interest of that aforementioned clarity and concision, we won’t concern ourselves with it here.
3 Both Microsoft and Apple have collected reams of data which they claim prove that their approach is the best one. I do suspect, however, that the original impetus can be found in the fact that MacOS was originally a single-tasking operating system, meaning that only one menu bar would need to be available at any one time. Windows, on the other hand, was designed as a multitasking environment from the start.
 


The Manhole


Because the CD-ROM version of The Manhole sold in relatively small numbers in comparison to the original floppy version, the late Russell Lieblich’s surprisingly varied original soundtrack is too seldom heard today. So, in the best tradition of multimedia computing (still a very new and sexy idea in the time about which I’m writing), feel free to listen while you read.




Were HyperCard “merely” the essential bridge between Ted Nelson’s Xanadu fantasy and the modern World Wide Web, it would stand as one of the most important pieces of software of the 1980s. But, improbably, HyperCard was even more than that. It’s easy to get so dazzled by its early implementation of hypertext that one loses track entirely of the other part of Bill Atkinson’s vision for the environment. True to the Macintosh, “the computer for the rest of us,” Atkinson designed HyperCard as a sort of computerized erector set for everyday users who might not care a whit about hypertext for its own sake. With HyperCard, he hoped, “a whole new body of people who have creative ideas but aren’t programmers will be able to express their ideas or expertise in certain subjects.”

He made good on that goal. An incredibly diverse group of people worked with HyperCard, a group in which traditional hackers were very much the minority. Danny Goodman, the man who became known as the world’s foremost authority on HyperCard programming, was actually a journalist whose earlier experiences with programming had been limited to a few dabblings in BASIC. In my earlier article about hypertext and HyperCard, I wrote how “a professor of music converted his entire Music Appreciation 101 course into a stack.” Well, readers, I meant that literally. He did it himself. Industry analyst and HyperCard zealot Jan Lewis:

You can do things with it [HyperCard] immediately. And you can do sexy things: graphics, animation, sound. You can do it without knowing how to program. You get immediate feedback; you can make a change and see or hear it immediately. And as you go up on the learning curve — let’s say you learn how to use HyperTalk [the bundled scripting language] — again, you can make changes easily and simply and get immediate feedback. It just feels good. It’s fun!

And yet HyperCard most definitely wasn’t a toy. People could and did make great, innovative, commercial-quality software using it. Nowhere is the power of HyperCard — a cultural as well as a technical power — illustrated more plainly than in the early careers of Rand and Robyn Miller.


Rand and Robyn had a very unusual upbringing. The first and third of the four sons of a wandering non-denominational preacher, they spent their childhoods moving wherever their father’s calling took him: from Dallas to Albuquerque, from Hawaii to Haiti to Spokane. They were a classic pairing of left brain and right brain. Rand had taken to computers from the instant he was introduced to them via a big time-shared system whilst still in junior high, and had made programming them into his career. By 1987, the year HyperCard dropped, he was to all appearances settled in life: 28 years old, married with children, living in a small town in East Texas, working for a bank as a programmer, and nurturing a love for the Apple Macintosh (he’d purchased his first Mac within days of the machine’s release back in 1984). He liked to read books on science. His brother Robyn, seven years his junior, was still trying to figure out what to do with his life. He was attending the University of Washington in somewhat desultory fashion as an alleged anthropology major, but devoted most of his energy to drawing pictures and playing the guitar. He liked to read adventure novels.

HyperCard struck Rand Miller, as it did so many, with all the force of a revelation. While he was an accomplished enough programmer to make a living at it, he wasn’t one who particularly enjoyed the detail work that went with the trade. “There are a lot of people who love digging down into the esoterics of compilers and C++, getting down and dirty with typed variables and all that stuff,” he says. “I wanted a quick return on investment. I just wanted to get things done.” HyperCard offered the chance to “get things done” dramatically faster and more easily than any programming environment he had ever seen. He became an immediate convert.


With two small girls of his own, Rand felt keenly the lack of quality children’s software for the Macintosh. He hit upon the idea of making a sort of interactive storybook using HyperCard, a very natural application for a hypertext tool. Lacking the artistic talent to make a go of the pictures, he thought of his little brother Robyn. The two men, so far apart in years and geography and living such different lives, weren’t really all that close. Nevertheless, Rand had a premonition that Robyn would be the perfect partner for his interactive storybook.

But Robyn, who had never owned a computer and had never had any interest in doing so, wasn’t immediately enticed by the idea of becoming a software developer. Getting him just to consider the idea took quite a number of letters and phone calls. At last, however, Robyn made his way down to the Macintosh his parents kept in the basement of the family home in Spokane and loaded up the copy of HyperCard his brother had sent him. There, like so many others, he was seduced by Bill Atkinson’s creation. He started playing around, just to see what he could make. What he made right away became something very different from the interactive storybook, complete with text and metaphorical pages, that Rand had envisioned. Robyn:

I started drawing this picture of a manhole — I don’t even know why. You clicked on it and the manhole cover would slide off. Then I made an animation of a vine growing out. The vine was huge, “Jack and the Beanstalk”-style. And then I didn’t want to turn the page. I wanted to be able to navigate up the vine, or go down into the manhole. I started creating a navigable world by using the very simple tools [of HyperCard]. I created this place.  I improvised my way through this world, creating one thing after another. Pretty soon I was creating little canals, and a forest with stars. I was inventing it as I went. And that’s how the world was born.

For his part, Rand had no problem accepting the change in approach:

Immediately you are enticed to explore instead of turning the page. Nobody sees a hole in the ground leading downward and a vine growing upward and in the distance a fire hydrant that says, “Touch me,” and wants to turn the page. You want to see what those things are. Instead of drawing the next page [when the player clicked a hotspot], he [Robyn] drew a picture that was closer — down in the manhole or above on the vine. It was kind of a stream of consciousness, but it became a place instead of a book. He started sending me these images, and I started connecting them, trying to make them work, make them interactive.


In this fashion, they built the world of The Manhole together: Robyn pulling its elements from the flotsam and jetsam of his consciousness and drawing them on the screen, Rand binding it all together into a contiguous place, and adding sound effects and voice snippets here and there. If they had tried to make a real game of the thing, with puzzles and goals, such a non-designed approach to design would likely have gone badly wrong in a hurry.

Luckily, puzzles and goals were never the point of The Manhole. It was intended always as just an endlessly interesting space to explore. As such, it would prove capable of captivating children and the proverbial young at heart for hours, full as it was of secrets and Easter eggs hidden in the craziest of places. One can play with The Manhole on and off for literally years, and still continue to stumble upon the occasional new thing. Interactions are often unexpected, and unexpectedly delightful. Hop in a rowboat to take a little ride and you might emerge in a rabbit’s teacup. Start watching a dragon’s television — Why does a dragon have a television? Who knows! — and you can teleport yourself into the image shown on the screen to emerge at the top of the world. Search long enough, and you might just discover a working piano you can actually play. The spirit of the thing is perhaps best conveyed by the five books you find inside the friendly rabbit’s home: Alice in Wonderland; The Wind in the Willows; The Lion, the Witch, and the Wardrobe; Winnie the Pooh; and Metaphors of Intercultural Philosophy (“This book isn’t about anything!”). Like all of those books excepting, presumably, the last, The Manhole is pretty wonderful, a perfect blend of sweet cuteness and tart whimsy.


With no contacts whatsoever within the Macintosh software industry, the brothers decided to publish The Manhole themselves via a tiny advertisement in the back of Macworld magazine, taken out under the auspices of Prolog, a consulting company Rand had founded as a moonlighting venture some time before. They rented a tiny booth to show The Manhole publicly for the first time at the Hyper Expo in San Francisco in June of 1988. (Yes, HyperCard mania had gotten so intense that there were entire trade shows dedicated just to it.) There they were delighted to receive a visit from none other than HyperCard’s creator Bill Atkinson, with his daughter Laura in tow; not yet five years old, she had no trouble navigating through their little world. Incredibly, Robyn had never even heard the word “hypertext” prior to the show, had no idea about the decades of theory that underpinned the program he had used, savant-like, to create The Manhole. When he met a band of Ted Nelson’s disgruntled Xanadu disciples on the show floor, come to crash the HyperCard party, he had no idea what they were on about.

But the brothers’ most important Hyper Expo encounter was a meeting with Richard Lehrberg, Vice President for Product Development at Mediagenic,[1] who took a copy of The Manhole away with him for evaluation. Lehrberg showed it to William Volk, whom he had just hired away from the small Macintosh and Amiga publisher Aegis to become Mediagenic’s head of technology; he described it to Volk unenthusiastically as “this little HyperCard thing” done by “two guys in Texas.” Volk was much more impressed. He was immediately intrigued by one aspect of The Manhole in particular: the way that it used no buttons or conventional user-interface elements at all. Instead, the pictures themselves were the interface; you could just click where you would and see what happened. It was perhaps a product of Robyn Miller’s sheer naïveté as much as anything else; seasoned computer people, so used to conventional interface paradigms, just didn’t think like that. But regardless of where it came from, Volk thought it was genius, a breaking down of a wall that had heretofore always separated the user from the virtual world. Volk:

The Miller brothers had come up with what I call the invisible interface. They had gotten rid of the idea of navigation buttons, which was what everyone was doing: go forward, go backward, turn right, turn left. They had made the scenes themselves the interface. You’re looking at a fire hydrant. You click on the fire hydrant; the fire hydrant sprays water. You click on the fire hydrant again; you zoom in to the fire hydrant, and there’s a little door on the fire hydrant. That was completely new.

Of course, other games did have you clicking “into” their world to make things happen; the point-and-click adventure genre was evolving rapidly during this period to replace the older parser-driven adventure games. But even games like Déjà Vu and Maniac Mansion, brilliantly innovative though they were, still surrounded their windows into their worlds with a clutter of “verb” buttons, legacies of the genre’s parser-driven roots. The Manhole, however, presented the player with nothing but its world. What with its defiantly non-Euclidean — not to say nonsensical — representation of space and its lack of goals and puzzles, The Manhole wasn’t a conventional adventure game by any stretch. Nevertheless, it pointed the way to what the genre would become, not least in the later works of the Miller brothers themselves.
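To make the contrast a little more concrete: in HyperCard, a scene like this was typically built by laying transparent buttons over the card’s artwork, each carrying a short HyperTalk script to go to another card or play an effect. What follows is only a rough Python sketch of that “invisible interface” idea — the card names, coordinates, and effects are all invented for illustration, and none of this is Cyan’s or Mediagenic’s actual code.

```python
# A minimal sketch of an "invisible interface": every card is just a picture,
# and regions of the picture itself are the only controls. All names and
# coordinates here are hypothetical, purely to illustrate the pattern.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Hotspot:
    rect: tuple[int, int, int, int]              # (x, y, width, height) in card coordinates
    goto: Optional[str] = None                   # card to jump to when clicked
    action: Optional[Callable[[], None]] = None  # or an in-place effect (animation, sound)

    def contains(self, x: int, y: int) -> bool:
        rx, ry, rw, rh = self.rect
        return rx <= x < rx + rw and ry <= y < ry + rh


# The "world" is a graph of cards; note there are no forward/back/turn buttons anywhere.
WORLD: dict[str, list[Hotspot]] = {
    "street": [
        Hotspot((400, 280, 50, 90), action=lambda: print("The fire hydrant sprays water.")),
        Hotspot((120, 300, 80, 40), goto="open-manhole",
                action=lambda: print("The manhole cover slides off.")),
        Hotspot((200, 100, 60, 180), goto="up-the-vine"),
    ],
    "open-manhole": [Hotspot((0, 0, 640, 480), goto="down-the-manhole")],
    "down-the-manhole": [Hotspot((0, 0, 640, 480), goto="street")],
    "up-the-vine": [Hotspot((0, 0, 640, 480), goto="street")],
}


def handle_click(card: str, x: int, y: int) -> str:
    """Hit-test a click against the current card's hotspots, first match wins."""
    for spot in WORLD[card]:
        if spot.contains(x, y):
            if spot.action:
                spot.action()
            return spot.goto if spot.goto else card  # stay put for in-place effects
    return card  # clicked on empty scenery; nothing happens


if __name__ == "__main__":
    card = "street"
    card = handle_click(card, 410, 300)  # hydrant: in-place effect, we stay on this card
    card = handle_click(card, 150, 320)  # manhole: the cover slides off, scene changes
    card = handle_click(card, 100, 100)  # click into the open hole to descend
    print("Now viewing card:", card)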

Much of Volk’s working life for the next two years would be spent on The Manhole, by the end of which period he would quite possibly be more familiar with its many nooks and crannies than its own creators were. He became The Manhole‘s champion inside Mediagenic, convincing his colleagues to publish it, thereby bringing it to a far wider audience than the Miller brothers could ever have reached on their own. Released by Mediagenic under their Activision imprint, it became a hit by the modest standards of the Macintosh consumer-software market. Macworld magazine named The Manhole the winner of their “Wild Card” category in a feature article on the best HyperCard stacks, while the Software Publishers Association gave it an “Excellence in Software” award for “Best New Use of a Computer.”


Well aware that The Manhole was acquiring a certain computer-chic cachet, Mediagenic/Activision didn’t hesitate to play that angle up in their advertising.

Had that been the end of it, The Manhole would remain historically interesting as both a delightful little curiosity of its era and the starting point of the hugely significant game-development careers of the Miller brothers. Yet there’s more to the story.

William Volk, frustrated with the endless delays of CD-I and the state of paralysis the entire industry was in when it came to the idea of publishing entertainment software on CD, had been looking for some time for a way to break the logjam. It was Stewart Alsop, an influential tech journalist, who first suggested to Volk that the answer to his dilemma was already part of Mediagenic’s catalog — that The Manhole would be perfect for CD-ROM. Volk was just the person to see such a project through, having already experimented extensively with CD-ROM and CD-I during his time at Aegis as well as at Mediagenic. With the permission of the Miller brothers, he recruited Russell Lieblich, Mediagenic’s longstanding guru in all things music- and sound-related, to compose and perform a soundtrack for The Manhole which would play from the CD as the player explored.

An important difference separates the way the music worked in the CD-ROM version of The Manhole from the way it worked in virtually all computer games to appear before it. The occasional brief digitized snippet aside, music in computer games had always been generated on the computer, whether by sound chips like the Commodore 64’s famous SID or entire sound boards like the top-of-its-class Roland MT-32 (we shall endeavor to forget the horrid beeps and squawks that issued from the IBM PC and Apple II’s native sound hardware). But The Manhole‘s music, while having been originally generated entirely or almost entirely on computers in Lieblich’s studio, was then recorded onto CD for digital playback, just like a song on a music CD. This method, made possible only by evolving computer sound hardware and, most importantly, by the huge storage capacity of a CD-ROM, would in the years to come slowly become simply the way that computer-game music was done. Today many big-budget titles hire entire orchestras to record soundtracks as elaborate and ambitious as the ones found in big Hollywood feature films, whilst also including digitized recordings of voices, squealing tires, explosions, and all the inevitable rest. In fact, surprisingly little of the sound present in most modern games is synthesized sound, a situation that has long since relegated elaborate setups like the Roland MT-32 to the status of white elephants; just pipe your digitized recording through a digital-to-analog converter and be done with it already.
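The distinction can be sketched in a few lines of code. Here is a small, purely illustrative example — Python standard library only, filenames invented, and no claim to resemble Mediagenic’s actual tools — of the two routes the paragraph above contrasts: synthesizing samples algorithmically at playback time, chip-style, versus simply reading back pre-recorded PCM samples that any sound hardware can pipe straight through its digital-to-analog converter.

```python
# Two routes to the same loudspeaker: synthesize samples in code (what sound
# chips and MIDI modules did), or play back samples recorded ahead of time
# (what CD-ROM soundtracks like The Manhole's did). Filenames are invented.

import struct
import wave

SAMPLE_RATE = 44100  # CD-quality samples per second


def synthesize_square_wave(frequency_hz: float, seconds: float) -> bytes:
    """Generate 16-bit mono samples algorithmically, chip-music style."""
    frames = []
    for n in range(int(SAMPLE_RATE * seconds)):
        phase = (n * frequency_hz / SAMPLE_RATE) % 1.0
        value = 12000 if phase < 0.5 else -12000  # crude square wave
        frames.append(struct.pack("<h", value))
    return b"".join(frames)


def write_wav(path: str, pcm: bytes) -> None:
    """Wrap raw PCM in a WAV container, ready for any DAC to play back."""
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)  # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(pcm)


def read_recorded_pcm(path: str) -> bytes:
    """The 'digitized' route: no synthesis at all, just read stored samples."""
    with wave.open(path, "rb") as wav:
        return wav.readframes(wav.getnframes())


if __name__ == "__main__":
    # Synthesized: the computer does the musical work at playback time.
    write_wav("synthesized_beep.wav", synthesize_square_wave(440.0, 1.0))
    # Digitized: the work was done in the studio; playback is mere copying.
    pcm = read_recorded_pcm("synthesized_beep.wav")
    print(f"Read back {len(pcm)} bytes of PCM, ready to stream to a DAC.")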

As the very first title to go all digitized all the time, The Manhole didn’t have a particularly easy time of it; getting the music to play without breaking up or stuttering as the player explored presented a huge challenge on the Macintosh, a machine whose minimalist design burdened the CPU with all of the work of sound generation. However, Volk and his colleagues got it going at last. Published in the spring of 1989, the CD-ROM version of The Manhole marked a major landmark in the history of computing, the first American game — or, at least, software toy (another big buzzword of the age, as it happens) — to be released on CD-ROM.[2] Volk, infuriated with Philips for the chaos and confusion CD-I’s endless delays had wrought in an industry he believed was crying out for the limitless vistas of optical storage, sent them a copy of The Manhole along with a curt note: “See! We did it! We’re tired of waiting!”

And they weren’t done yet. Having gotten The Manhole working on CD-ROM on the Macintosh, Volk and his colleagues at Mediagenic next tackled the daunting task of porting it to the most popular platform for consumer software, MS-DOS — a platform without HyperCard. To address this lack, Mediagenic developed a custom engine for CD-ROM titles on MS-DOS, dubbing it the Multimedia Applications Development Environment, or MADE.[3] Mediagenic’s in-house team of artists redrew Robyn Miller’s original black-and-white illustrations in color, and The Manhole on CD-ROM for MS-DOS shipped in 1990.


In my opinion, The Manhole lost some of its unique charm when it was colorized for MS-DOS. The VGA graphics, impressive in their day, look just a bit garish and overdone today in comparison to the classic pen-and-ink style of the original.

The Manhole, idiosyncratic piece of artsy children’s software that it was, could hardly have been expected to break the industry’s optical logjam all on its own. For that matter, one has to acknowledge that its CD-ROM incarnation was little more than the floppy version with a soundtrack playing in the background — a nice addition certainly, but perhaps not quite the transformative experience which all of the rhetoric surrounding CD-ROM’s potential might have led one to expect. It would take another few excruciating years for a CD-ROM drive to become a must-have accessory for everyday American computers. Yet every revolution has to start somewhere, and William Volk deserves his full measure of credit for doing what he could to push this one forward in the only way that could ultimately matter: by stepping up and delivering a real, tangible product at long last. As Steve Jobs used to say, “Real artists ship.”

The importance of The Manhole, existing as it does right there at the locus of so much that was new and important in computing in the late 1980s, can be read in so many ways that there’s always a danger of losing some of them in the shuffle. But it should never be forgotten whilst trying to sort through the tangle that this astonishingly creative little world was principally designed by someone who had barely touched a computer in his life before he sat down with HyperCard. That he wound up with something so fascinating is a huge tribute not just to Robyn Miller and his enabling brother Rand, but also to Bill Atkinson’s HyperCard itself. Apple has long since abandoned HyperCard, and we enjoy no precise equivalent to it today. Indeed, its vision of intuitive, non-pretentious, fun programming is one that we’re in danger of losing altogether. Being one who loves the computer most of all as the most exciting tool for creation ever invented, I can’t help but see that as a horrible shame.

The Miller brothers had, as most of you reading this probably know, a far longer future in front of them than HyperCard would get to enjoy. Already well before 1988 was through they had rechristened themselves Cyan Productions, a name that felt much more appropriate for a creative development house than the businesslike Prolog. As Cyan, they made two more pieces of children’s software, Cosmic Osmo and the Worlds Beyond the Mackerel and Spelunx and the Caves of Mr. Seudo. Both were once again made using HyperCard, and both were very much made in the spirit of The Manhole. And like The Manhole both were published on CD-ROM as well as floppy disk; the Miller brothers, having learned much from Mediagenic’s process of moving their first title to CD-ROM, handled the CD-ROM as well as the floppy versions themselves when it came to these later efforts. Opinions are somewhat divided on whether the two later Cyan children’s titles fully recapture the magic that has led so many adults and children alike over the years to spend so much time plumbing the depths of The Manhole. None, however, can argue with the significance of what came next, the Miller brothers’ graduation to games for adults — and, as it happens, another huge milestone in the slow-motion CD-ROM revolution. But that story, like so many others, is one that we’ll have to tell at another time.

(Sources: Amstrad Action of January 1990; Macworld of July 1988, October 1988, November 1988, March 1989, April 1989, and December 1989; Wired of August 1994 and October 1999; The New York Times of November 28 1989. Also the books Myst and Riven: The World of the D’ni by Mark J.P. Wolf and Prima’s Official Strategy Guide: Myst by Rick Barba and Rusel DeMaria, and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.” Online sources include Robyn Miller’s Myst postmortem from the 2013 Game Developer’s Conference; Richard Moss’s Ludiphilia podcast; a blog post by Robyn Miller. Finally, my huge thanks to William Volk for sharing his memories and impressions with me in an interview and for sending me an original copy of The Manhole on CD-ROM for my research.

The original floppy-disk-based version of The Manhole can be played online at archive.org. The Manhole: Masterpiece Edition, a remake supervised by the Miller brothers in 1994 which sports much-improved graphics and sound, is available for purchase on Steam.)

Footnotes

1 Activision was renamed Mediagenic at almost the very instant that Lehrberg first met the Miller brothers. When the name change was greeted with universal derision, Activision/Mediagenic CEO Bruce Davis quickly began backpedaling on his hasty decision. The Manhole, for instance, was released by Mediagenic under their “Activision” label — which was odd because under the new ordering said label was supposed to be reserved for games, and The Manhole was considered children’s software, not a traditional game. I just stick with the name “Mediagenic” in this article as the least confusing way to address a confusing situation.
2 The story of the first CD-based software to reach European consumers says worlds about the differences that persisted between American and European computing, and about the sheer can-do ingenuity that so often allowed British programmers in particular to squeeze every last ounce of potential out of hardware that was usually significantly inferior to that enjoyed by their American counterparts. Codemasters, a budget software house based in Warwickshire, came up with a unique shovelware package for the 1989 Christmas season. They transferred thirty old games from cassette to a conventional audio CD, which they then sold along with a special cable to run the output from an ordinary music-CD player into a Sinclair or Amstrad home computer. “Here’s your CD-ROM,” they said. “Have a ball.” By all accounts, Codemasters’s self-proclaimed “CD revolution,” kind of hilarious and kind of brilliant, did quite well for them. When it came to doing more with less in computing, you never could beat the Brits.
3 MADE’s scripting language was to some extent based on AdvSys, a language for amateur text-adventure creation that never quite took off like the contemporaneous AGT.
 
 


The Freedom to Associate

In 1854, an Austrian priest and physics teacher named Gregor Mendel sought and received permission from his abbot to plant a two-acre garden of pea plants on the grounds of the monastery at which he lived. Over the course of the next seven years, he bred together thousands upon thousands of the plants under carefully controlled circumstances, recording in a journal the appearance of every single offspring that resulted, as defined by seven characteristics: plant height, pod shape and color, seed shape and color, and flower position and color. In the end, he collected enough data to formulate the basis of the modern science of genetics, in the form of a theory of dominant and recessive traits passed down in pairs from generation to generation. He presented his paper on the subject, “Experiments on Plant Hybridization,” before the Natural History Society of Brünn in 1865, and saw it published in a poorly circulated scientific journal the following year.

And then came… nothing. For various reasons — perhaps due partly to the paper’s unassuming title, perhaps due partly to the fact that Mendel was hardly a known figure in the world of biology, undoubtedly due largely to the poor circulation of the journal in which it was published — few noticed it at all, and those who did dismissed it seemingly without grasping its import. Most notably, Charles Darwin, whose On the Origin of Species had been published while Mendel was in the midst of his own experiments, seems never to have been aware of the paper at all, thereby missing this key gear in the mechanism of evolution. Mendel was promoted to abbot of his monastery shortly after the publication of his paper, and the increased responsibilities of his new post ended his career as a scientist. He died in 1884, remembered as a quiet man of religion who had for a time been a gentleman dabbler in the science of botany.

But then, at the turn of the century, the German botanist Carl Correns stumbled upon Mendel’s work while conducting his own investigations into floral genetics, becoming in the process the first to grasp its true significance. To his huge credit, he advanced Mendel’s name as the real originator of the set of theories which he, along with one or two other scientists working independently, was beginning to rediscover. Correns effectively shamed those other scientists as well into acknowledging that Mendel had figured it all out decades before any of them even came close. It was truly a selfless act; today the name of Carl Correns is unknown except in esoteric scientific circles, while Gregor Mendel’s has been done the ultimate honor of becoming an adjective (“Mendelian”) and a noun (“Mendelism”) locatable in any good dictionary.


Vannevar Bush

So, all’s well that ends well, right? Well, maybe, but maybe not. Some 30 years after the rediscovery of Mendel’s work, an American named Vannevar Bush, dean of MIT’s School of Engineering, came to see the 35 years that had passed between the publication of Mendel’s theory and the affirmation of its importance as a troubling symptom of the modern condition. Once upon a time, all knowledge had been regarded as of a piece, and it had been possible for a great mind to hold within itself huge swathes of this collective knowledge of humanity, everything informing everything else. Think of that classic example of a Renaissance man, Leonardo da Vinci, who was simultaneously a musician, a physicist, a mathematician, an anatomist, a botanist, a geologist, a cartographer, an alchemist, an astronomer, an engineer, and an inventor. Most of all, of course, he was a great visual artist, but he used everything else he was carrying around in that giant brain of his to create paintings and drawings as technically meticulous as they were artistically sublime.

By Bush’s time, however, the world had long since entered the Age of the Specialist. As the sheer quantity of information in every field exploded, those who wished to do worthwhile work in any given field — even those people gifted with giant brains — were increasingly being forced to dedicate their intellectual lives entirely to that field and only that field, just to keep up. The intellectual elite were in danger of becoming a race of mole people, closeted one-dimensionals fixated always on the details of their ever more specialized trades, never on the bigger picture. And even then, the amount of information surrounding them was so vast, and existing systems for indexing and keeping track of it all so feeble, that they could miss really important stuff within their own specialties; witness the way the biologists of the late nineteenth century had missed Gregor Mendel’s work, and the 35-year head start it had cost the new science of genetics. “Mendel’s work was lost,” Bush would later write, “because of the crudity with which information is transmitted between men.” How many other major scientific advances were lying lost in the flood of articles being published every year, a flood that had increased by an order of magnitude just since Mendel’s time? “In this are thoughts,” wrote Bush, “certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over.” “This sort of catastrophe is undoubtedly being repeated all around us,” he believed, “as truly significant attainments become lost in the sea of the inconsequential.”

Bush’s musings were swept aside for a time by the rush of historical events. As the prospect of another world war loomed, he became President Franklin Delano Roosevelt’s foremost advisor on matters involving science and engineering. During the war, he shepherded through countless major advances in the technologies of attack and defense, culminating in the most fearsome weapon the world had ever known: the atomic bomb. It was actually this last that caused Bush to return to the seemingly unrelated topic of information management, a problem he now saw in a more urgent light than ever. Clearly the world was entering a new era, one with far less tolerance for the human folly, born of so much context-less mole-person ideology, that had spawned the current war.

Practical man that he was, Bush decided there was nothing for it but to roll up his sleeves and make a concrete proposal describing how humanity could solve the needle-in-a-haystack problem of the modern information explosion. Doing so must entail grappling with something as fundamental as “how creative men think, and what can be done to help them think. It is a problem of how the great mass of material shall be handled so that the individual can draw from it what he needs — instantly, correctly, and with utter freedom.”

As revolutionary manifestos go, Vannevar Bush’s “As We May Think” is very unusual in terms of both the man that wrote it and the audience that read it. Bush was no Karl Marx, toiling away in discontented obscurity and poverty. On the contrary, he was a wealthy upper-class patrician who was, as a member of the White House inner circle, about as fabulously well-connected as it was possible for a man to be. His article appeared first in the July 1945 edition of the Atlantic Monthly, hardly a bastion of radical thought. Soon after, it was republished in somewhat abridged form by Life, the most popular magazine on the planet. Thereby did this visionary document reach literally millions of readers.

With the atomic bomb still a state secret, Bush couldn’t refer directly to his real reasons for wanting so urgently to write down his ideas now. Yet the dawning of the atomic age nevertheless haunts his article.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Seen in one light, Bush’s essay is similar to many of those that would follow from other Manhattan Project alumni during the uncertain interstitial period between the end of World War II and the onset of the Cold War. Bush was like many of his colleagues in feeling the need to advance a utopian agenda to counter the apocalyptic potential of the weapon they had wrought, in needing to see the ultimate evil that was the atomic bomb in almost paradoxical terms as a potential force for good that would finally shake the world awake.

Bush was true to his engineer’s heart, however, in basing his utopian vision on technology rather than politics. The world was drowning in information, making the act of information synthesis — intradisciplinary and interdisciplinary alike — ever more difficult.

The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and reenter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve it, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

Bush was not among the vanishingly small number of people who were working in the nascent field of digital computing in 1945. His “memex,” the invention he proposed to let an individual free-associate all of the information in her personal library, was more steampunk than cyberpunk, all whirring gears, snickering levers, and whooshing microfilm strips. But really, those things are just details; he got all of the important stuff right. I want to quote some more from “As We May Think,” and somewhat at length at that, because… well, because its vision of the future is just that important. This is how the memex should work:

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.


Ted Nelson

There is no record of what all those millions of Atlantic Monthly and Life readers made of Bush’s ideas in 1945 — or for that matter if they made anything of them at all. In the decades that followed, however, the article became a touchstone of the burgeoning semi-underground world of creative computing. Among its discoverers was Ted Nelson, who is, depending on whom you talk to, either one of the greatest visionaries in the history of computing or one of the greatest crackpots — or, quite possibly, both. Born in 1937 to a Hollywood director and his actress wife, then raised by his wealthy and indulgent grandparents following the inevitable Hollywood divorce, Nelson would see his life largely defined by what Gary Wolf, in his classic profile for Wired magazine, called his “aversion to finishing.” As in, finishing anything at all, or just the concept of finishing in the abstract. Well into middle age, he would be diagnosed with attention-deficit disorder, an alleged malady he came to celebrate as his “hummingbird mind.” This condition perhaps explains why he was so eager to find a way of forging permanent, retraceable associations among all the information floating around inside and outside his brain.

Nelson coined the terms “hypertext” and “hypermedia” at some point during the early 1960s, when he was a graduate student at Harvard. (Typically, he got a score of Incomplete in the course for which he invented them, not to mention an Incomplete on his PhD as a whole.) While they’re widely used all but interchangeably today, in Nelson’s original formulation the former term was reserved for purely textual works, the latter for those incorporating other forms of media, like images and sound. But today we’ll just go with the modern flow, call them all hypertexts, and leave it at that. In his scheme, then, hypertexts were texts capable of being “zipped” together with other hypertexts, memex-like, wherever the reader or writer wished to preserve associations between them. He presented his new buzzwords to the world at a conference of the Association for Computing Machinery in 1965, to little impact. Nelson, possessed of a loudly declamatory style of discourse and all the rabble-rousing fervor of a street-corner anarchist, would never be taken all that seriously by the academic establishment.

Instead, it being the 1960s and all, he went underground, embracing computing’s burgeoning counterculture. His eventual testament, one of the few things he ever did manage to complete — after a fashion, at any rate — was a massive 1200-page tome called Computer Lib/Dream Machines, self-published in 1974, just in time for the heyday of the Altair and the Homebrew Computer Club, whose members embraced Nelson as something of a patron saint. As the name would indicate, Computer Lib/Dream Machines was actually two separate books, bound back to back. Theoretically, Computer Lib was the more grounded volume, full of practical advice about gaining access to and using computers, while Dream Machines was full of the really out-there ideas. In practice, though, they were often hard to distinguish. Indeed, it was hard to even find anything in the books, which were published as mimeographed facsimile copies filled with jotted marginalia and cartoons drafted in Nelson’s shaky hand, with no table of contents or page numbers and no discernible organizing principle beyond the stream of consciousness of Nelson’s hummingbird mind. (I trust that the irony of a book concerned with finding new organizing principles for information itself being such an impenetrable morass is too obvious to be worth belaboring further.) Nelson followed Computer Lib/Dream Machines with 1981’s Literary Machines, a text written in a similar style that dwelt, when it could be bothered, at even greater length on the idea of hypertext.

The most consistently central theme of Nelson’s books, to whatever extent one could be discerned, was an elaboration of the hypertext concept he called Xanadu, after the pleasure palace in Samuel Taylor Coleridge’s poem “Kubla Khan.” The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Coleridge had intended to write. Problem was, in the course of writing down the first part of his waking dream he was interrupted; by the time he returned to his desk he had simply forgotten the rest.

So, Nelson’s Xanadu was intended to preserve information that would otherwise be lost, which goal it would achieve through associative linking on a global scale. Beyond that, it was almost impossible to say precisely what Xanadu was or wasn’t. Certainly it sounds much like the World Wide Web to modern ears, but Nelson insists adamantly that the web is a mere bad implementation of the merest shadow of his full idea. Xanadu has been under allegedly active development since the late 1960s, making it the most long-lived single project in the history of computer programming, and by far history’s most legendary piece of vaporware. As of this writing, the sum total of all those years of work is a set of web pages written in Nelson’s inimitable declamatory style, littered with angry screeds against the World Wide Web, along with some online samples that either don’t work quite right or are simply too paradigm-shattering for my poor mind to grasp.

In my own years on this planet, I’ve come to reserve my greatest respect for people who finish things, a judgment which perhaps makes me less than the ideal critic of Ted Nelson’s work. Nevertheless, even I can recognize that Nelson deserves huge credit for transporting Bush’s ideas to their natural habitat of digital computers, for inventing the term “hypertext,” for defining an approach to links (or “zips”) in a digital space, and, last but far from least, for making the crucial leap from Vannevar Bush’s concept of the single-user memex machine to an interconnected global network of hyperlinks.

But of course ideas, of which both Bush and Nelson had so many, are not finished implementations. During the 1960s, 1970s, and early 1980s, there were various efforts — in addition, that is, to the quixotic effort that was Xanadu — to wrestle at least some of the concepts put forward by these two visionaries into concrete existence. Yet it wouldn’t be until 1987 that a corporation with real financial resources and real commercial savvy would at last place a reasonably complete implementation of hypertext before the public. And it all started with a frustrated programmer looking for a project.

Steve Jobs and Bill Atkinson

Had he never had anything to do with hypertext, Bill Atkinson’s place in the history of computing would still be assured. Coming to Apple Computer in 1978, when the company was only about eighteen months removed from that famous Cupertino garage, Atkinson was instrumental in convincing Steve Jobs to visit the Xerox Palo Alto Research Center, thereby setting in motion the chain of events that would lead to the Macintosh. A brilliant programmer by anybody’s measure, he eventually wound up on the Lisa team. He wrote the routines to draw pixels onto the Lisa’s screen — routines on which, what with the Lisa being a fundamentally graphical machine whose every display was bitmapped, every other program depended. Jobs was so impressed by this work, which Atkinson had named LisaGraf, that he recruited him to port the routines over to the nascent Macintosh. Atkinson’s routines, now dubbed QuickDraw, would remain at the core of MacOS for the next fifteen years. But Atkinson’s contribution to the Mac went yet further: after QuickDraw, he proceeded to design and program MacPaint, one of the two applications included with the finished machine, and one that’s still justifiably regarded as a little marvel of intuitive user-interface design.

Atkinson’s work on the Mac was so essential to the machine’s success that shortly after its release he became just the fourth person to be named an Apple Fellow — an honor that carried with it, implicitly if not explicitly, a degree of autonomy for the recipient in the choosing of future projects. The first project that Atkinson chose for himself was something he called the Magic Slate, based on a gadget called the Dynabook that had been proposed years before by Xerox PARC alum (and Atkinson’s fellow Apple Fellow) Alan Kay: a small, thin, inexpensive handheld computer controlled via a touch screen. It was, as anyone who has ever seen an iPhone or iPad will attest, a prescient project indeed, but also one that simply wasn’t realizable using mid-1980s computer technology. Having been convinced of this at last by his skeptical managers after some months of flailing, Atkinson wondered if he might not be able to create the next best thing in the form of a sort of software version of the Magic Slate, running on the Macintosh desktop.

In a way, the Magic Slate had always had as much to do with the ideas of Bush and Nelson as it did with those of Kay. Atkinson had envisioned its interface as a network of “pages” which the user navigated among by tapping links therein — a hypertext in its own right. Now he transported the same concept to the Macintosh desktop, whilst making his metaphorical pages into metaphorical stacks of index cards. He called the end result, the product of many months of design and programming, “Wildcard.” Later, when the trademark “Wildcard” proved to be tied up by another company, it turned into “HyperCard” — a much better name anyway in my book.

By the time he had HyperCard in some sort of reasonably usable shape, Atkinson was all but convinced that he would have to either sell the thing to some outside software publisher or start his own company to market it. With Steve Jobs now long gone and with him much of the old Jobsian spirit of changing the world through better computing, Apple was heavily focused on turning the Macintosh into a practical business machine. The new, more sober mood in Cupertino — not to mention Apple’s more buttoned-down public image — seemed to indicate that the company was hardly up for another wide-eyed “revolutionary” product. It was Alan Kay, still kicking around Cupertino puttering with this and that, who convinced Atkinson to give CEO John Sculley a chance before he took HyperCard elsewhere. Kay brokered a meeting between Sculley and Atkinson, in which the latter was able to personally demonstrate to the former what he’d been working on all those months. Much to Atkinson’s surprise, Sculley loved HyperCard. Apparently at least some of the old Jobsian fervor was still alive and well after all inside Apple’s executive suite.

To modern eyes, a HyperCard stack at its most basic resembles nothing so much as a PowerPoint presentation, albeit one which can be navigated non-linearly by tapping links on the slides themselves. Just as in PowerPoint, the HyperCard designer could drag and drop various forms of media onto a card. Taken even at this fairly superficial level, HyperCard was already a full-fledged hypertext-authoring (and hypertext-reading) tool — by no means the first specimen of its kind, but the first with the requisite combination of friendliness, practicality, and attractiveness to make it an appealing environment for the everyday computer user. One of Atkinson’s favorite early demo stacks had many cards with pictures of people wearing hats. If you clicked on a hat, you were sent to another card showing someone else wearing a hat. Ditto for other articles of fashion. It may sound banal, but this really was revolutionary: organization by association in action. Indeed, one might say that HyperCard was Vannevar Bush’s memex, fully realized at last.

But the system showed itself to have much, much more to offer when the author started to dig into HyperTalk, the included scripting language. All sorts of logic, simple or complex, could be accomplished by linking scripts to clicks on the surface of the cards. At this level, HyperCard became an almost magical tool for some types of game development, as we’ll see in future articles. It was also a natural fit for many other applications: information kiosks, interactive tutorials, educational software, expert systems, reference libraries, etc.
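To give a concrete flavor of what that looked like, here is a minimal illustrative sketch of a HyperTalk handler attached to a button; the card name, the field name, and the scenario itself are invented for this example rather than taken from any real stack:

on mouseUp
  -- runs whenever the user clicks the button this script is attached to
  -- record the time of the visit in a text field on the current card
  put "Last visited at" && the time into card field "Status"
  -- then jump to another card, exactly as a plain link would
  go to card "Hat Gallery"
end mouseUp

Handlers like this could be attached not just to buttons but to fields, to cards, and to the stack as a whole, which is what let stacks grow from simple collections of linked pages into genuine applications.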

HyperCard in action

John Sculley himself premiered HyperCard at the August 1987 MacWorld show. Showing unusual largess in his determination to get HyperCard into the hands of as many people as possible as quickly as possible, he announced that henceforward all new Macs would ship with a free copy of the system, while existing owners could buy copies for their machines for just $49. He called HyperCard the most important product Apple had released during his tenure there. Considering that Sculley had also been present for the launch of the original Macintosh, this was certainly saying something. And yet he wasn’t clearly in the wrong either. As important as the Macintosh, the realization in practical commercial form of the computer-interface paradigms pioneered at Xerox PARC during the 1970s, has been to our digital lives of today, the concept of associative indexing — hyperlinking — has proved at least as significant. But then, the two do go together like strawberries and cream, the point-and-click paradigm providing the perfect way to intuitively navigate through a labyrinth of hyperlinks. It was no coincidence that an enjoyable implementation of hypertext appeared first on the Macintosh; the latter almost seemed a prerequisite for the former.

The full revolutionary nature of the concept of hypertext was far from easy to get across in advertising copy, but Apple gave it a surprisingly good go, paying due homage to Vannevar Bush in the process.

In the wake of that MacWorld presentation, a towering tide of HyperCard hype rolled from one side of the computer industry to the other, out into the mainstream media, and then back again, over and over. Hypertext’s time had finally come. In 1985, it was an esoteric fringe concept known only to academics and a handful of hackers, being treated at real length and depth in print only in Ted Nelson’s own sprawling, well-nigh impenetrable tomes. Four years later, every bookstore in the land sported a shelf positively groaning with trendy paperbacks advertising hypertext this and hypertext that. By then the curmudgeons had also begun to come out in force, always a sure sign that an idea has truly reached critical mass. Presentations showed up in conference catalogs with snarky titles like “Hypertext: Will It Cook Me Breakfast Too?”

The curmudgeons had plenty of rabid enthusiasm to push back against. HyperCard, even more so than the Macintosh itself, had a way of turning the most sober-minded computing veterans into starry-eyed fanatics. Jan Lewis, a longtime business-computing analyst, declared that “HyperCard is going to revolutionize the way computing is done, and possibly the way human thought is done.” Throwing caution to the wind, she abandoned her post at InfoWorld to found HyperAge, the first magazine dedicated to the revolution. “There’s a tremendous demand,” she said. “If you look at the online services, the bulletin boards, the various ad hoc meetings, user groups — there is literally a HyperCulture developing, almost a cult.” To judge from her own impassioned statements, she should know. She recruited Ted Nelson himself — one of the HyperCard holy trinity of Bush, Nelson, and Atkinson — to write a monthly column.

HyperCard effectively amounted to an entirely new computing platform that just happened to run atop the older platform that was the Macintosh. As Lewis noted, user-created HyperCard stacks — this new platform’s word for “programs” or “software” — were soon being traded all over the telecommunications networks. The first commercial publisher to jump into the HyperCard game was, somewhat surprisingly, Mediagenic.[1] Bruce Davis, Mediagenic’s CEO, has hardly gone down in history as a paragon of progressive thought in the realms of computer games and software in general, but he defied his modern reputation in this one area at least by pushing quickly and aggressively into “stackware.” One of the first examples of same that Mediagenic published was Focal Point, a collection of business and personal-productivity tools written by one Danny Goodman, who was soon to publish a massive bible called The Complete HyperCard Handbook, thus securing for himself the mantle of the new ecosystem’s go-to programming guru. Focal Point was a fine demonstration that just about any sort of software could be created by the sufficiently motivated HyperCard programmer. But it was another early Mediagenic release, City to City, that was more indicative of the system’s real potential. It was a travel guide to most major American cities — an effortlessly browsable and searchable guide to “the best food, lodgings, and other necessities” to be found in each of the metropolises in its database.

City to City

Other publishers — large, small, and just starting out — followed Mediagenic’s lead, releasing a bevy of fascinating products. The people behind The Whole Earth Catalog — themselves the inspiration for Ted Nelson’s efforts in self-publication — converted their current edition into a HyperCard stack filling a staggering 80 floppy disks. A tiny company called Voyager combined HyperCard with a laser-disc player — a very common combination among ambitious early HyperCard developers — to offer an interactive version of the National Gallery of Art which could be explored using such associative search terms as “Impressionist landscapes with boats.” Culture 1.0 let you explore its namesake through “3700 years of Western history — over 200 graphics, 2000 hypertext links, and 90 essays covering topics from the Black Plague to Impressionism,” all on just 7 floppy disks. Mission: The Moon, from the newly launched interactive arm of ABC News, gathered together details of every single Mercury, Gemini, and Apollo mission, including videos of each mission hosted on a companion laser disc. A professor of music converted his entire Music Appreciation 101 course into a stack. The American Heritage Dictionary appeared as stackware. And lots of what we might call “middlestackware” appeared to help budding programmers with their own creations: HyperComposer for writing music in HyperCard, Take One for adding animations to cards.

Just two things were still missing before hypertext could reach its full potential through HyperCard. One was a storage medium capable of holding lots of data, to allow for truly rich multimedia experiences, combining the lavish amounts of video, still pictures, music, sound, and of course text that the system clearly cried out for. Thankfully, that problem was about to be remedied via a new technology which we’ll be examining in my very next article.

The other problem was a little thornier, and would take a little longer to solve. For all its wonders, a HyperCard stack was still confined to the single Macintosh on which it ran; there was no provision for linking between stacks running on entirely separate computers. In other words, one might think of a HyperCard stack as equivalent to a single web site running locally off a single computer’s hard drive, without the ability to field external links alongside its internal links. Thus the really key component of Ted Nelson’s Xanadu dream, that of a networked hypertext environment potentially spanning the entire globe, remained unrealized. In 1990, Bill Nisen, the developer of a hypertext system called Guide that slightly predated HyperCard but wasn’t as practical or usable, stated the problem thus:

The one thing that is precluding the wide acceptance of hypertext and hypermedia is adequate broadcast mechanisms. We need to find ways in which we can broadcast the results of hypermedia authoring. We’re looking to, in the future, the ubiquitous availability of local-area networks and low-cost digital-transmission facilities. Once we can put the results of this authoring into the hands of more users, we’re going to see this industry really explode.

Already at the time Nisen made that statement, a British researcher named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol. The first real web site, the beginning of the World Wide Web, would go online in 1991. It would take a few more years even from that point, but a shared hypertextual space of a scope and scale the likes of which few could imagine was on the way. The world already had its memex in the form of HyperCard. Now — and although this equivalency would scandalize Ted Nelson — it was about to get its Xanadu.

Associative indexing permeates our lives so thoroughly today that, as with so many truly fundamental paradigm shifts, the full scope of the change it has wrought can be difficult to fully appreciate. A century ago, education was still largely an exercise in retention: names, dates, Latin verb cognates. Today’s educational institutions — at least the more enlightened ones — recognize that it’s more important to teach their pupils how to think than it is to fill their heads with facts; facts, after all, are now cheap and easy to acquire when you need them. That such a revolution in the way we think about thought happened in just a couple of decades strikes me as incredible. That I happened to be present to witness it strikes me as amazing.

What I’ve witnessed has been a revolution in humanity’s relationship to information itself that’s every bit as significant as any political revolution in history. Some Singularity proponents will tell you that it marks the first step on the road to a vast worldwide consciousness. But even if you choose not to go that far, the ideas of Vannevar Bush and Ted Nelson are still with you every time you bring up Google. We live in a world in which much of the sum total of human knowledge is available over an electronic connection found in almost every modern home. This is wondrous. Yet what’s still more wondrous is the way that we can find almost any obscure fact, passage, opinion, or idea we like from within that mass, thanks to selection by association. Mama, we’re all cyborgs now.

(Sources: the books Hackers: Heroes of the Computer Revolution and Insanely Great: The Life and Times of the Macintosh, the Computer That Changed Everything by Steven Levy; Computer Lib/Dream Machines and Literary Machines by Ted Nelson; From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, edited by James M. Nyce and Paul Kahn; The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort; Multimedia and Hypertext: The Internet and Beyond by Jakob Nielsen; The Making of the Atomic Bomb by Richard Rhodes. Also the June 1995 Wired magazine profile of Ted Nelson; Andy Hertzfeld’s website Folklore; and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.”)

Footnotes

[1] Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article.