20 Feb

The Apple Macintosh had one hell of a long and winding road to join Steve Jobs onstage in front of a cheering throng at De Anza College’s Flint Auditorium on January 24, 1984. It was never even a particular priority of its parent company until, all other options being exhausted, it suddenly had to be. But once it finally was let out of its bag it became, just as its father predicted, the computer that changed everything.

Jobs wasn’t even the first father the Mac knew. It had originally been conceived almost five years earlier by another dreamer, digital utopianist, and early Apple employee named Jef Raskin who believed he could save the world — or at least make it a better place — if he could just build the Dynabook.

The brainchild of still another dreamer and visionary named Alan Kay, who first began to write and speak of it in the very early days of Xerox PARC, the Dynabook was more thought experiment than realistic proposal — a conception, an aspirational vision of what could one day be. Kay called it “a dynamic medium for creative thought”:

Imagine having your own self-contained knowledge manipulator in a portable package the size and shape of an ordinary notebook. Suppose it had enough power to outrace your senses of sight and hearing, enough capacity to store for later retrieval thousands of page-equivalents of reference materials, poems, letters, recipes, records, drawings, animations, musical scores, waveforms, dynamic simulations, and anything else you would like to remember and change.

The Dynabook was a tall order in light of the realities of 1970s computer technology. Indeed, nothing that came remotely close would actually appear for another two decades at least. As Kay himself once put it, thinkers generally fall into two categories: the da Vincis who sketch away like mad and spin out a dozen impractical ideas before breakfast upon which later generations can build careers and obsessions; and the Michelangelos who tackle huge but ultimately practical projects and get them done. Kay was a da Vinci to the bone. The PARC researchers dubbed the Alto, the less fanciful workstation they built to be their primary engine of innovation for the time being, the “interim Dynabook.”

Michael Scott, Steve Jobs, Jef Raskin, Chris Espinosa, and Steve Wozniak circa 1977

Much later in the decade, Raskin thought he might advance the cause a bit more with an interim Dynabook of his own. He thought even the much-loved Apple II was too complicated, too difficult and fiddly, too aesthetically unpleasant, too big to ever play an important role in the life of anyone more interested in what she could do with a computer than in the computer as an end in itself. He therefore pitched to the executives at Apple his idea for a relatively cheap (about $1000) and portable computer that, far from being the hardware hacker’s playground that was the Apple II, would be a sealed, finished piece — the only one you had to buy to start expressing yourself digitally. Even all the software you’d need would come built right in. Believing that the standard industry practice of naming prototypes after women (as often as not the prettiest secretary in the office) was sexist, he decided to call his idea Macintosh, after his favorite type of (edible) apples, the McIntosh.

In many ways Raskin’s idea cut directly against the grain of Apple’s corporate strategy, which was to further penetrate the business market, in the short term via the Apple III and in the long via the Lisa; both projects were already underway, although the latter was in nothing like the form it would eventually assume. While Apple was trying to trade in their bellbottoms for three-piece suits, Raskin was still living the hippie dream of bringing power to the people. “If I wanted to work for a business company, I’d join IBM,” he told Apple’s president Mike Scott. Still, the company was booming and an IPO was already visible on the horizon. There was enough money and enough hippie utopianism still hanging about the place to let Raskin and a few others tinker with his project.

The Macintosh project during its first eighteen months rarely had a staff of more than four, and often fewer than that; Raskin had to fight for scraps. Sometimes that worked out just as well; a key acquisition was Burrell Smith, a talented hardware engineer he rescued from a job as a lowly service technician, testing and repairing Apple IIs that had come back to the company under warranty. Smith became the Mac’s hardware guru, a position he would continue to hold right up through the launch and some time beyond, giving him by far the longest tenure of any member of the original team. Given Raskin’s price window, Smith couldn’t afford to design anything that would be much more powerful than the Apple II; the first prototype was built around an 8-bit Motorola 6809 no more powerful than the Apple II’s 6502, and had just 64 K of memory. It did, however, use a relatively high-resolution bitmapped display in lieu of the Apple II’s text. Although Raskin was oddly unenamored of mice and windows, this part at least of the Xerox PARC gospel had reached him loud and clear.

With Raskin himself often not seeming sure what he wanted and what was doable and many of his staff not seeming overly interested in buckling down to work on his schemes, the project languished through most of 1980. On one or two occasions it was actually cancelled, only to be revived in response to Raskin’s impassioned pleas. Yet practical progress was hard to see. Raskin mostly busied himself with The Book of Macintosh, a sort of aspirational bible hardly more practical than Kay’s original dream of the Dynabook. Then Steve Jobs read The Book of Macintosh and promptly came in and took his computer away from him.

Jobs was a huge headache for Michael Scott, Mike Markkula, and the rest of Apple’s senior leadership, who received memos almost daily complaining about his temper, his dismissive attitude toward the Apple II platform that was the only thing supporting the company, and his refusal to listen to reason when one of his sacred precepts was threatened. Jobs’s headstrong authoritarianism had been a big contributor to the debacle that was the Apple III launch. (Traditional wisdom, as well as an earlier version of this article, would have it that Jobs’s insistence that the Apple III ship without a cooling fan led directly to the hardware problems that left Apple IIIs dying on buyers’ desks by the thousands. It does, however, appear that this version of events is at least questionable; see the comments section for more about that. Be that as it may, everyone involved would agree that Jobs did an already muddled project no favors.) The Apple III never recovered, and would pass into history as Apple’s first flop. Now he was sowing the same chaos within the Lisa project, a computer the company simply couldn’t afford to let go the same way as the Apple III. Scott and Markkula forcibly removed him from Lisa in late 1980. They would have liked for him to just content himself with enjoying his post-IPO millions and accepting the occasional medal at the White House as a symbol of the American entrepreneurial spirit while they got on with actually running his company for him. They would have liked, in other words, for Jobs to be like Wozniak, who dipped in and out of the occasional engineering project but mostly was happy to spend his time organizing rock festivals and finishing his education and learning to fly an airplane and generally having all the good times he’d missed during a youth spent with his head buried in circuit boards. Jobs, alas, was not so pliable. He wanted an active role at what was after all still in some moral sense his company. 

Trouble was, every time he took an active role in anything at all, anger and failure followed. Thus his forcible eviction from Lisa while it still looked salvageable. But at the same time Apple certainly couldn’t afford an ugly break with their founder and entrepreneurial golden boy. When a hurt Jobs started to lick his wounds from Lisa not through ugly public recriminations but by interesting himself in Raskin’s strictly small-time Macintosh project, the executives therefore took it as very good news. Let him tinker and meddle to his heart’s content with that little vanity project.

But Jobs’s interest was very bad news for one Jef Raskin. Never really technical himself, Jobs nevertheless knew very well how technical people thought. He innocently suggested to Burrell Smith that he might dump the plebeian old Motorola 6809 in favor of the sexy new 68000 that the Lisa people were using, and double the Mac’s memory to 128 K while he was at it. That was an offer no hardware hacker could resist. With Smith successfully subverted, it was just a matter of time. Raskin wrote furious memos to upper management about Jobs’s unauthorized takeover of his project, but they fell on predictably deaf ears. Instead, in early 1981 the takeover was made official. Jobs condescendingly offered Raskin the opportunity to stay with the Macintosh in the role of technical writer. Raskin, who by all indications had an ego almost as big as Jobs’s own, refused indignantly and walked out. He never forgave Jobs for co-opting his vision and stealing his project, remaining convinced until his death in 2005 that his Macintosh would have been better for Apple and better for the world than Jobs’s.

For all that the project had been in existence for over eighteen months already, there was very little really to Macintosh at the time of the takeover — just Raskin’s voluminous writings and some crude hardware based on an obsolete chip that resoundingly failed to live up to the visions expressed in The Book of Macintosh. Thus one could say that the real story of the Macintosh, the story of the machine that Jobs would finally unveil in January of 1984, begins here. Which is not to say that Jobs discarded Raskin’s vision entirely; he had after all been originally drawn to the project by the ideas inside The Book of Macintosh. Although the $1000 goal would be quietly dropped in fairly short order, the new machine should nevertheless be inexpensive at least in comparison to the Lisa, and should stress elegance and simplicity and the needs of everyday non-computer people above all else. Jobs, however, shared none of Raskin’s skepticism about mice and menus. He had bought the GUI religion hook, line, and sinker, and intended the graphical user interface to be every bit as integral to the Macintosh as it was to the Lisa. Hell, if he could find a way to make it more so he’d do that too.

Much of the original Mac team: Bill Atkinson, Andy Hertzfeld, Chris Espinosa, George Crow, Joanna Hoffman, Burrell Smith, and Jerry Manock. Taking a leaf from Electronic Arts’s playbook, Apple photographed them often in artful poses like this one during the Mac’s initial promotional push.

Still with pull within Apple the likes of which Raskin could only dream of, Jobs began assembling a group of stars to start over and make his version of Macintosh. Joining Smith the hardware guru were additional hardware engineer George Crow; programmers Andy Hertzfeld, Larry Kenyon, Chris Espinosa, Bruce Horn, Steve Capps, Bud Tribble, and Bill Atkinson; industrial designer Jerry Manock to shape the external look and feel of the machine; Susan Kare to shape the internal look and feel as designer of graphics, icons, and fonts; and Joanna Hoffman as writer, marketer, and the team’s face to the outside world, the first “Mac evangelist.” Jobs even briefly recruited Wozniak, but the latter found it hard to stay focused on the Mac, as he would just about every other project after his Apple II masterpieces, and soon wandered off again. Others would come and go, but the names listed above were the core of the team that would, just as Jobs so often promised them was inevitable, change the world.

Jobs deliberately fostered an “us against the world” mentality, with the world in this case apparently including the rest of Apple — particularly the much larger and more bureaucratic Lisa team. His dictum that “It’s better to be pirates than to join the Navy” shaped the Mac team’s conception of itself as a brilliant little band of rebels out to make a better world for everyone. They even took to flying a skull-and-crossbones flag outside their offices on the Apple campus. They were united by a sincere belief that the work they were doing mattered. “We all felt as though we had missed the civil-rights movement,” said one later. “We had missed Vietnam. What we had was Macintosh.” Their pranks and adventures have become computer-industry folklore (literally; Andy Hertzfeld’s longstanding website is full of them, and makes great reading).

Of course, one person’s genius at work is another’s self-entitled jerk. A joke was soon making the rounds at Apple:

How many Macintosh Division employees do you need to change a light bulb?

One. He holds the bulb up and lets the universe revolve around him.

Perhaps the people with the most justification for feeling aggrieved were those poor plodding pedants — in Jobs’s view, anyway — of the Lisa team. As Steve Capps would later put it, “A lot of people think we ripped off Xerox. But really we ripped off Lisa.”

To say that the Mac could not have existed without Lisa is in no way an overstatement. Mac was quite literally built on Lisa; for a long time the only way to program it was via one of the prototype Lisas installed in the team’s office. The Mac people watched everything the Lisa people did carefully, then reaped the fruit of whatever labor seemed useful to them. They happily digested the conclusions of the Lisa team’s exhaustive user testing of various designs and interfaces and built them into Mac. They took Bill Atkinson’s QuickDraw, the core rendering layer at the base of the Lisa’s bitmapped display, for the Mac. Later, Jobs managed to take its programmer as well; in addition to QuickDraw, Atkinson became the author of the MacPaint application. Yes, Jobs proved surprisingly willing to borrow from the work of a team he dismissed as unimaginative plodders. The brilliance of the people involved is one answer to the question of how Macintosh was created by so few. Lisa, however, is another.

The Mac people regarded their leader with a combination of awe and bemused tolerance. It was team member Bud Tribble who coined perhaps the most famous of all descriptions for Jobs’s unique charisma, that of the “reality distortion field.” “In his presence,” noted Tribble, “reality is malleable. He can convince anyone of practically anything.” Tribble elaborated further on Jobs’s unique style:

Just because he tells you that something is awful or great, it doesn’t necessarily mean he’ll feel that way tomorrow. You have to low-pass filter his input. And then, he’s really funny about ideas. If you tell him a new idea, he’ll usually tell you that he thinks it’s stupid. But then, if he actually likes it, exactly one week later he’ll come back to you and propose your idea to you, as if he thought of it.

The aforementioned reality distortion field kept this sort of behavior from seeming as obnoxious as it would have from just about anyone else. Anyway, everyone was well aware that it was only because of Jobs’s patronage that the Mac project was tolerated at all at Apple. This little group of pirates, convinced that what they were doing was indeed (to choose another of Jobs’s catchphrases) “insanely great,” something that would change the world, knew that they owed the vision and the opportunity for Macintosh to Jobs. Atkinson later noted that “You only get one chance to change the world. Nothing else matters as much — you’ll have another chance to have vacations, have kids.” Most people, of course, don’t ever even get one chance. He and the rest of them owed theirs to Jobs.

Thankful as they were, they were hardly mindless disciples. They did their best to redirect his course when he got details as wrong as he got the big-picture vision right. When their reasoning failed, as it usually did with the imperious Jobs, they did their best to subvert him and/or to minimize the damage.

The list of bad decisions Jobs made about Macintosh is long, easily long enough to torpedo virtually any other computer. He insisted that the Mac use the same horrifically unreliable in-house-designed “Twiggy” disk drives as the Lisa, an example of borrowing a bit too much from Mac’s older sister. He categorically rejected pleas that the Mac at least have the option of memory expansion beyond 128 K, insisting that doing so would just encourage programming inefficiency and turn the Macintosh into a bloated monster like Lisa; his team’s arguments that a bitmapped, GUI-driven operating system running under a 16-bit processor required by its very nature vastly more memory than something like the Apple II got them nowhere. He rejected an internal hard drive because it would require that most hated of all pieces of technology, a noisy fan. He rejected a second internal floppy drive because there wouldn’t be room in Jerry Manock’s sleekly elegant case, plus bloat and all that. He tried to kill the Apple LaserWriter, a product that would prove almost as significant for the company as the Mac itself and without which the Mac might very well not have survived beyond its first couple of years. He cut short all discussion of networking by pulling out a floppy disk and pronouncing, “Here’s your network!” (The laser printer and Ethernet, those two other parts of the PARC gospel, had most resoundingly not reached Jobs during his famous visit.) He even refused to permit cursor keys on the keyboard, saying that the mouse was the only proper way to move the cursor in this new paradigm of computing.

The original Mac keyboard, complete with no cursor keys

People did what they could in the face of this litany. Burrell Smith made sure the Mac was capable of accommodating 3.5-inch floppy drives, the emerging industry standard soon to replace the older 5.25-inch floppies, as well as the Twiggy. When Lisa debuted a year ahead of the Mac and the Twiggy drives proved a disaster, the Mac manufacturing team was able to easily slot the 3.5-inch drives into their place. (Taking the fall for Twiggy was another great service Lisa did Macintosh.) Everyone also made sure that the Mac was ready to accommodate more memory on both the hardware and software side, for when the realization finally dawned that 128 K just wasn’t going to cut it. (That realization began to dawn quite early even for Jobs; the machine he unveiled to press and public on January 24, 1984, had in fact been hacked to have 512 K. Otherwise the presentation would have been a less impressive one altogether, with a lot more time spent waiting for the Mac to deign to do something and none of the cool synthesized speech.) For most of the rest, there wasn’t much for it but to hope the machine did well enough with the early adopters that they could go back and fix the problems later. Cooler heads in management did at least prevail to save the LaserWriter.

On the hardware side, the Macintosh was smartly but minimalistically designed by Burrell Smith, a huge admirer of Steve Wozniak who strained to craft the same sort of elegant circuitry for the Mac that Woz had for the Apple II. For all that it was clean and compact, however, the Mac wasn’t terribly interesting or impressive as a piece of hardware. Jobs, from a contemporary interview in Byte magazine:

By paying a little more for the microprocessor, not only were we able to give the customer an infinitely more powerful chip than, say, an 8-bit chip or one of Intel’s baby micros, but we were able to pick up this amazing software [referring here to Bill Atkinson’s QuickDraw layer sourced from the Lisa project], and that allowed us to throw tons of chips out of this thing. We didn’t have to get special custom text or graphics chips. We just simplified the system down to where it’s just a bitmap on the screen, just Bill’s amazing software and Burrell’s amazing hardware, then in between that the other amazing software that we have. We tried to do this in every single way, with the disk and with the I/O…

The Macintosh, in other words, asks a hell of a lot of its 68000 CPU, something it could get away with because, well, it was a 68000, the most powerful reasonably priced chip in the industry at the time. A person reading that Byte interview might have asked what the 68000 could do with a bit more support in the hardware around it. That question would be answered in fairly resounding form by later 68000-based machines, most notably the Amiga, which could run rings around the Mac.

But of course that line of argument is a touch unfair; not only was the Mac among the first machines built around the 68000 architecture, but it was also the first PC in the world to be principally defined not by its hardware but by its software. And the newly minted MacOS was indeed a brilliant creation, one that went in many ways far beyond what its legendary predecessors at Xerox PARC had managed. Incredible as the Xerox Alto was, there’s a hell of a lot that’s become a standard part of GUIs everywhere that dates not from the Xerox of the 1970s but from the Apple of the early 1980s. Amongst these are such basic building blocks as pull-down menus and even the idea of windows as draggable entities that can overlap and be stacked atop one another; on the Alto they were non-overlapping tiles fixed in place (as they also were, incidentally, in the earliest versions of Microsoft Windows). One of Jobs’s favorite aphorisms during the final frantic year of Mac development was “Real Artists Ship!” This was something the tinkerers and theorists at PARC never quite managed to do. As anyone who’s ever finished a big creative project knows, the work of polishing and perfecting usually absorbs far more time and effort — and tedious, difficult effort at that — than hammering out the rough concept ever does. Apple did this heavy lifting, thus enshrining Xerox PARC as well as the Mac itself forever in computing legend. And they did it well — phenomenally well. I have my problems with Apple then and now, but this should never be forgotten.

As the Mac began to assume concrete form at the beginning of 1983, Jobs’s star at Apple was again in the ascendant. After years of muddled leadership from Michael Scott and Mike Markkula, the company had finally decided that a more dynamic leader was needed. Scott and Markkula had been Silicon Valley insiders steeped in engineering detail; Markkula had personally contributed code, testing, and documentation to the company’s early projects. To bring to fruition Jobs’s vision for Apple as a great mainstream company, known and loved by the masses, a very different sort of leader would be needed. Ideally, of course, that leader would be him, but Apple’s board wasn’t that crazy. As a second-best alternative, Jobs became enamored with a very unconventional choice indeed: a marketing expert and polished East Coast blue blood who was currently running the Pepsi brand. His name was John Sculley, and it was doubtful whether he would even know how to turn on one of Apple’s computers.

Steve Jobs and John Sculley at the Mac’s public introduction on January 24, 1984.

Even had he never hooked up with Apple, Sculley’s name would be enshrined in business lore and MBA syllabi. Not yet 45 when Jobs’s courtship began, Sculley was already a decorated general of the Cola Wars. He had been one of the pioneers of what would come to be called “lifestyle advertising.” You know the sort of thing: all those advertisements that show cool, pretty people doing interesting things whilst listening to the hippest music and, oh, yes, just happening to enjoy a Pepsi while they’re about it. (“Come alive — you’re in the Pepsi Generation!”) “Boy,” thinks the consumer, “I’d like to be like those people.” And next time she’s at the grocery store, she picks up a six-pack of Pepsi. It sounds absurd, but, as one look at your television screen will tell you, it’s very, very effective. Very few of us are immune; I must sheepishly admit that I once bought a Volkswagen thanks largely to a certain advertisement featuring a certain Nick Drake song. As Mad Men has since taught all of us and Sculley grasped decades ago, the cleverest advertising doesn’t sell us a product; it sells us possibility. The best examples of the lifestyle form, like that Volkswagen spot, can be compelling and inspired and even beautiful.

If that wasn’t enough, Sculley was later instrumental to the most legendary Cola Wars campaign of all time, the Pepsi Challenge, which cleverly combined the lifestyle approach with the more conventional hard sell. The advertisements showed that it just happened to be the cool, attractive people — many of them hip young celebrities and athletes — who preferred the taste of Pepsi to that of Coke. The ads were everywhere, an inescapable part of the cultural landscape of the late 1970s and early 1980s. And, judging by the relative sales trends of Coke and Pepsi, they were very, very effective; for the root cause of the “New Coke” fiasco of the mid-1980s, look no further.

Now Jobs wanted Sculley to do the same thing for Apple, to craft for the company an identity that transcended the specifications sheets and price comparisons that sullied their competitors. To some extent Apple already enjoyed a special status; their compelling origin story and the charisma of their two young founders along with the engaging personality of their signature creation the Apple II gave them a cachet of which drabber, more conventional companies, executives, and computers could only dream. Now Jobs believed he and Sculley together could leverage that image to make an Apple computer the hippest lifestyle accessory of the 1980s. There was more than a little bit of utopian fervor to Jobs’s vision, part and parcel of that strange intermingling of hardheaded business ambition and counterculture idealism that has always seen Jobs and the company he founded selling a better world for a rather steep price. Jobs’s deal-closing pitch to Sculley, which may never have actually passed his lips in such pithy form, has nevertheless gone down into Apple lore: “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” How could anyone refuse?

It became increasingly clear as 1983 wore on and Sculley settled into his new West Coast digs that the specific Apple computer that would be doing the world-changing must be the Macintosh. The Lisa was a flop, done in by intrinsic failings, like the unreliable Twiggy drives and its beautiful but molasses-slow GUI, and some extrinsic ones, like its high price and the uncertainty of big business — the only people who could realistically buy the thing — over what it really was good for. Nor did Jobs’s persistent whispers to reporters to just wait, that something cheaper and even better was coming soon, do the Lisa any favors.

Still, by many measures the Mac was not only cheaper but better than Lisa. Its 68000 architecture may have been unexceptional, but so was the Lisa’s — and the Mac’s 68000 was clocked at 8 MHz, a full 3 MHz faster than the Lisa’s. The Mac’s operating system was slim and lightweight, written in pure 68000 assembly language, as opposed to the Lisa’s bigger and more ambitious (overambitious?) operating system which was mostly written in Pascal. There was a price to be paid for the Mac’s slim efficiency; in some areas like multitasking and memory protection MacOS wouldn’t fully equal LisaOS until the arrival of OS X in 2001. But an average user just trying to get stuff done will make lots of compromises to have a snappy, usable interface — something which, at least in contrast to the Lisa, the Mac had in spades.

After years of being condemned as a backwater project with little relevance to Apple’s business-centric corporate direction, Macintosh was gearing up for the big launch, and Jobs and his band of pirates now found themselves taking center stage. Macintosh was now the future of Apple; Macintosh simply had to succeed. The last five years at Apple had been marked by the ever-greater success of the Apple II almost in spite of its parent company and two colossal and expensive failures to develop a viable successor to that beloved platform. Apple was still a major force in the PC industry, with yearly revenues approaching $1 billion. Yet they were also in a desperately precarious position, dependent as they still were on the archaic Apple II technology and their absurdly high profit margins on same. At some point people had to stop buying the Apple II, which was now thoroughly outclassed in some areas (notably graphics and sound) by competition like the Commodore 64 that cost a fraction of the price. With the Apple III and the Lisa lost causes, the Macintosh by default represented Apple’s last chance to field a viable bridge to the Apple II-less future that had to come one of these days. Given the age of the Apple II, it was highly doubtful whether they would have time to go back to the drawing board and create yet another new machine for yet another kick at the can. The Mac represented their third strike; it was Mac or bust. Steve Jobs and his team reveled in it and prepared to change the world.

The Macintosh was announced to the world on January 22, 1984. Early in the third quarter of Super Bowl XVIII and not long after one of IBM’s Charlie Chaplin spots for the ill-fated PCjr, an audience bloated with chips and beer and bored with a rather lackluster football game saw this, the most famous Super Bowl advertisement of all time.

Most people had no idea whatsoever what Apple was on about, had no idea that Big Brother represented the hated IBM who had taken the lead in business computing that Apple felt was rightfully theirs. The commercial was the talk of the media for the next few days, as everyone speculated about just what this “Macintosh” thing was and what it had to do with hammer-hurling freedom fighters. The advertisement, which it soon emerged had been directed by none other than that master of dystopia Ridley Scott of Alien and Blade Runner fame, would never darken a television screen again. No need; it had done its job, and would go down into history alongside Lyndon Johnson’s “Daisy” ad as one of the two most famous one-and-done commercials of all time.

The “1984” spot was an overheated, rather adolescent piece of rhetoric, coming off almost like a caricature of Apple’s exaggerated self-importance. It was by no means beloved by everyone even within Apple. The Mac’s moving up to become the company’s biggest priority hadn’t changed the determination of most of their executive wing to make it not as a maker of home and hobbyist computers, a competitor to Commodore and Atari and Radio Shack, but as a player in the much more lucrative field of business computing, where IBM (and, increasingly, IBM clones, a story for another time) ruled. Meanwhile Jobs still saw the Macintosh as he always had, as a way of changing not just the business world but the world full stop — which didn’t quite mean that he wanted to get down in the trenches with the likes of Commodore either, mind you, but which did funnel his ambitions for the platform in a very different direction. Caught somewhere in the middle was John Sculley, a man who had been brought in thanks to his prowess as a consumer marketer but was nevertheless beholden to both factions. The constant push and pull between them, and the mixed messaging that resulted, would very nearly sink the Mac. Just before the Mac’s introduction, the business faction pushed through a rise in the list price from $2000 to a more businesslike $2500. But then came the “1984” commercial, whose lurid tone was all but guaranteed to repulse exactly the corporate leaders the business interests wanted to attract; these folks identified more with Big Brother than with the hammer-wielding freedom fighter. It would go on like that for a long time.

At the official launch on January 24, Jobs publicly committed Apple to the goal of selling 50,000 Macs in the first hundred days. It was dangerously ambitious; to miss the goal would be embarrassing and momentum-killing. In the end they managed it and then some; sales reached some 70,000, and they might have sold even more if not for teething problems at the factory typical of a new computer. Virtually all of the machines they sold, however, went not to corporations but to forward-thinking individuals of a certain technological bent and disposable income who rightly recognized in the Mac a new future paradigm. Douglas Adams, who saw his first Mac in Infocom’s offices and promptly fell in love, was archetypical of the demographic.

All of which was fine as far as it went — Apple was happy to sell to individuals too if they had the money to buy — but didn’t do a lot to further the dream of the Mac as a rival to the IBM PC on the desks of corporate America. Equally frustrating was much of the software that appeared that first year, which often tended toward games and other frivolous stuff frowned upon by corporations. By year’s end the pool of early adopters with disposable income was already looking exhausted and corporations still weren’t buying. The result was tens of thousands of Macs piling up in warehouses and cancelled production orders. At year end total sales amounted to 250,000, about half of Jobs’s projections at launch time. And sales were getting worse every month, not better. It was beginning to look disconcertingly like Strike 3 — Apple III and Lisa all over again. The only thing keeping the company in the black was still the inexplicably evergreen Apple II, which in 1984, that supposed Year of the Macintosh, enjoyed its best sales yet. Revenue from the Apple II amounted to 2.5 times that from the Mac. Apple II loyalists, who despite Apple’s official claims of “Apple II Forever!” could see where the company’s real priorities lay, took no small delight in this reality.

Joanna Hoffman, the marketer who was with the Mac project almost from the beginning, frankly admitted later that the sales results were, at least in retrospect, unsurprising.

It’s a miracle that it sold anything at all. This was a computer with a single disk drive, no memory capacity, and almost no applications. People who bought it did so on seduction. It was not a rational buy. It was astonishing that Macintosh sold as many as it did.

Or, as Douglas Adams put it:

What I (and I think everybody else who bought the machine in the early days) fell in love with was not the machine itself, which was ridiculously slow and underpowered, but a romantic idea of the machine. And that romantic idea had to sustain me through the realities of actually working on the 128 K Mac.

Those realities could be hellish. The single floppy drive combined with the inadequate memory could make the original Mac as excruciating to actually use as it was fun to wax poetic about, with the process of just copying a single disk requiring more than fifty disk swaps and twenty minutes. MacWrite, the Mac’s flagship version of that bedrock of business applications the word processor, was so starved for memory that you could only create a document of about eight pages. Determined Mac zealots swapped tips on how to chain files together to craft their Great American Novels, while the business world just shrugged and turned back to their ugly but functional WordStar screens. The Mac was a toy, at best an interesting curiosity; IBM was still the choice for real work.

"Test Drive" ad campaign

Sculley did his best to apply his Pepsi marketing genius to the Mac, but found it tough sledding. That Christmas Apple began the “Test Drive a Macintosh” campaign, which — shades of the Pepsi Challenge — let prospective buyers take a machine home for free to play with for 24 hours. Some 200,000 did so, but very few actually bought afterward, leaving stores with nothing but a bunch of used Macs to show for their trouble. For the 1985 Super Bowl, Apple attempted to recapture some of the Mac’s launch buzz with another high-concept commercial, this one depicting IBM users as mindless lemmings trudging off the side of a cliff. Ridley Scott’s brother Tony did the directing honors this time between pre-production work on Top Gun. But by now it all just felt kind of trite and childish, not to mention insulting to the very businesspeople Apple was trying to win over. Reaction from corporate America was so negative that Apple briefly considered taking out a full-page apology in the Wall Street Journal.

Apple’s summer of discontent, the rock-bottom point for the Mac, came in 1985. Not only were Mac sales still moribund, but by then another terrifying reality was becoming clear: Apple II sales were also slowing. The previous year had at last been the top of the bell curve. The day they had dreaded loomed, the day when they would have no viable next-generation machine and no faithful Apple II to fall back on. Apple closed three of their six factories and laid off 20 percent of their workforce, some 1450 people, that bleak summer.

Shortly after, Steve Jobs finally walked away from Apple following an acrimonious split with his erstwhile best mate John Sculley and a clumsy failed coup in the Apple boardroom. Jobs had proved psychologically incapable of accepting or addressing the Mac’s failings as both a piece of computer hardware and as a marketplace proposition. Jay Elliott, Apple’s head of human resources, summed up his situation beautifully:

[Jobs] could see that horizon out there, a thousand miles out. But he could never see the details of each little mile that had to be covered to get there. That was his genius and his downfall.

The Macintosh, like Apple itself, needed a practical repairman in 1985, not a bold visionary. This was a role Jobs was, at least at this phase of his life, eminently unqualified to play. And so he had made life intolerable for everyone, until the ugly public split that several generations of previous Apple management had only just found ways to avoid had come at last. The famed Apple mojo seemed all but gone, lost along with their charismatic founder.

But, as happens often (if not quite often enough) in business as in life, that summer proved to be the darkness before the dawn. Apple’s engineers had not been idle while the Mac struggled through its difficult first year, but had rather set doggedly to work to correct the worst of its failings. An external floppy drive became available a few months after launch, greatly alleviating the hell of disk swapping. The so-called “Fat Mac” with 512 K of memory, the amount most of the development team not named Jobs had agreed was appropriate from the start, appeared late in 1984. A hard disk and even cursor keys — their lack had been one of the more loathed aspects of the original machine if also a boon for makers of add-on keypads — were in the offing, as was, slowly and painfully, a workable networking system. The loss of Jobs only made such alleged dilutions of his vision easier to accomplish. The buggy original systems software was slowly tweaked and upgraded, while a third-party software ecosystem steadily grew on the backs of enthusiastic early adopters with money to spend. It didn’t come as quickly as Apple would have liked, and much of it wasn’t initially as businesslike as they might have liked, but the software — and with it a burgeoning community of famously loyal users — did come. Indeed, it was a third-party developer who arguably saved the Macintosh in tandem with another product of Apple’s busy engineering staff.

Paul Brainerd was a techie with a background in publishing who had for some time dreamed of finding a way to revolutionize the complicated and expensive process of traditional typesetting — pasteboards, huge industrial printers, and all the rest — through microcomputer technology. He had been stymied by two sore lacks: a computer with a high-resolution graphics display capable of showing what a document would look like on the printed page, pictures and all; and a printer capable of producing said document on paper. When he saw the Mac for the first time, he recognized that one of these needs had been met at last. When he reached out to Apple, they let him in on a secret: they had a solution for the other in the works as well, in the form of the LaserWriter, an affordable — in publishing terms; it would cost about $7000 — laser printer. The combination of the Mac, the LaserWriter, and the software Brainerd would eventually produce to make use of them, Aldus PageMaker, would invent the field of desktop publishing and change everything for the Mac and for Apple.

Like so much else about the Mac, it wasn’t an entirely original concept. Way back in circa 1975, Ginn & Co., a textbook publisher and Xerox subsidiary out of Boston, were gifted by the researchers at PARC with some Altos and a custom interface to hook them up to a big Dover laser printer. Ginn became the first all-digital publisher in the world. “Initially the reaction to the concept was, ‘You’re going to have to drag me kicking and screaming,'” said Tim Mott, one of the PARC people chiefly responsible for the project. “But everyone who sat in front of that system and used it, to a person, was a convert within an hour.” It was in fact Ginn’s editors who coined the ubiquitous terms “cut” and “paste,” a reference to the old manual process of cutting out manuscripts and photographs and pasting them onto pasteboard for typesetting. Now, a decade later, the rest of the world would finally get the opportunity to follow Ginn’s lead. The Mac had its killer app for business at last.

In retrospect it should have been obvious. It had been obvious to Xerox, hardly a company revered for vision; their big attempt to package PARC’s innovations into commercial form had come with the Xerox Star, a “document-processing workstation” that was essentially a sneak preview of desktop publishing before the term existed. But Apple, and especially Jobs, had been so focused on the Macintosh as a revolutionary force of nature in all aspects of the human condition that they’d had trouble thinking in terms of the concrete, practical applications that made corporations buy computers.

Publishers loved PageMaker. It turned what had been an all-night, all-hands-on-deck process, a hot, dirty nightmare of paste and print and paper for countless small periodicals and corporate publishing departments, into something almost painless, something downright fun. Apple came to call PageMaker and its competitors, which were soon springing up like toadstools after a rain, their Trojan Horses. A brave purchasing manager would buy a couple of Macs and a LaserWriter as an experiment, and six months later the same company would be coming back for fifty or a hundred more. Publishing would become the first of several creative niche industries that the Mac would absolutely own, even as IBM continued to dominate the mainstream of business. It wasn’t quite the grand head-to-head challenge that Jobs had dreamed of, but, combined with sales of the Apple II that would remain in decline but surprisingly strong for the rest of the decade, it was a pretty good living.

Apple had been very, very lucky; they and the Mac had blundered through somehow. David Bunnell, longtime publisher of Macworld magazine, summarized the Mac’s formative years bluntly:

To hold up the Macintosh experience as an example of how to create a great product, launch an industry, or spark a revolution is a cruel joke. Anyone who models their business startup on the Macintosh startup is doomed to failure. Miracles like the Macintosh can only happen once.

If the bargain with practicality represented by the Macintosh as desktop-publishing specialist seems disheartening, consider how genuinely empowering just this application was to countless people. For it wasn’t just big or medium-sized companies who bought Macs for this purpose. Especially as the prices of software and hardware came down, the small printers, the neighborhood associations, the church groups could also get in on the act. It’s astonishing how ugly the average fanzine or newsletter of 1980 is compared to that of 1995. The difference is almost entirely down to the Macintosh, which let people get their messages out there in a form of which no one need be ashamed. Many, like a young man named Eliot Cohen who used his Mac to start a popular newsletter focusing on his obsession of New York Mets baseball and soon found himself in the locker room interviewing his heroes as the slick magazines called to beg for his insights, credited the Mac with literally changing their lives. This democratizing of the means of production is one of the most inspiring outcomes of the PC revolution and, much as I’m ambivalent about some aspects of the platform and its parent company, of the Mac itself. Indeed, I have a special reason for giving credit where it’s due: the logical successors to the Mac-enabled fanzines that were everywhere by the early 1990s are blogs like this one. We’re still riding that same continuum of change.

Consider also how immense was the Mac’s soft power. People — even people who rejected the Mac itself as an overpriced boondoggle — somehow recognized that this was the way computers really ought to work. It became an ideal, striven for if seldom reached, for years afterward. No matter; other computers were made better by the striving. Even machines like the lowly Commodore 64 soon housed their own valiant attempts at replicating MacOS. To really get the scope of the changes wrought by the Mac, one need only compare the average Commodore 64 or Apple II game of, say, 1983 and 1986. A friendly GUI, of the sort which felt revolutionary when it appeared in the landmark Pinball Construction Set in 1983, was practically the baseline norm by 1986. The hardware hadn’t changed a whit; the vision of what could be done with it had. So, the Macintosh really did end up changing the world. Steve Jobs, wrong about so many nitpicky things, was breathtakingly right about that.

(The Macintosh story has been told so often and by so many that the biggest problem in writing an article like this one is sorting through it all and trying to inject some grounding into the more evangelistic accounts. My primary book sources were Insanely Great by Steven Levy; West of Eden by Frank Rose; Apple Confidential by Owen Linzmayer; and Dealers of Lightning by Michael A. Hiltzik. Andy Hertzfeld’s is also a goldmine. The Byte quote given above is from the February 1984 issue, part of a series of features greeting the Mac’s arrival. Various episodes of Computer Chronicles, archived by the dedicated folks at, also informed the article. See in particular “Mainframes to Minis to Micros”; “Integrated Software”; “Printers”; “Computer Ergonomics”; “The Macintosh Computer”; “Computer Graphics”; “Slowdown in the Silicon Valley” Parts One and Two; “Printers and Business Graphics”; and “Desktop Publishing” Parts One and Two. The photos sprinkled through the article are from Apple Confidential, except for the picture of the original Mac keyboard, which was taken from the aforementioned issue of Byte.)


Posted by on February 20, 2014 in Digital Antiquaria, Interactive Fiction



29 Responses to Macintosh

  1. Keith Palmer

    February 20, 2014 at 11:02 pm

    When I noticed what seemed an elision in your “American hardware in 1984” piece, I got to musing on how beginning to use a Macintosh (as my own family did at the end of 1992, when it was pretty clear our Tandy Color Computer had reached a certain end of the line) is in some ways to detach oneself from the main current of computer gaming. (However, it may be that very detachment that keeps me interested in the history of it.) Then, though, I did begin to think of all the ways I knew a piece on the early Macintosh could be made skeptical and critical (with the certain possibility of extending that all the way to now), perhaps especially so in the context of having written about Apple II and Commodore 64 games with a book about the Amiga in the mix too…

    However, as loaded as words like “balanced” and “even” might be made to sound, after I’d read my way through this piece I want to say they do seem to apply. I was particularly struck by your portrayal of Jef Raskin’s project (for which, perhaps, the initial idea of a 6809 processor does make me think of the Color Computer and makes me want to know more about how its software would have worked, even if I haven’t done an awful lot of actual looking for that information) and the brief note about how Steve Jobs’s usurpation of it might have been seen from afar as sort of like Steve Wozniak’s own career arc at Apple.

    This does feel unfortunately like a somewhat grisly correction to suggest, but in seeing Raskin’s later opinions of Jobs described in the present tense I remembered how I’d heard he had died in 2005. It also seems part of the folklore that the price of the Macintosh rising to $2500 was to pay for an advertising blitz, a much more minor note perhaps. (In letting that thought shape my own impressions of “little things getting to be big problems later,” though, I was surprised when I managed a little while ago to buy some early issues of Macworld magazine for a new perspective and noticed the “128K” price was cut back to $2000 when the Macintosh 512K was introduced… although there, of course, the impression might have been the more expensive model was still the only one actually usable.)

    • Jimmy Maher

      February 21, 2014 at 7:46 am

      You’re certainly right that the Mac was never a big mainstream gaming machine, but it’s nevertheless a fascinating platform for anyone interested in creative computing in that a ton of groundbreaking software appeared there which had a huge impact on mainstream (and non-mainstream) gaming and ludic narrative on other platforms. Stuff like StorySpace, HyperCard, The Fool’s Errand, and of course Myst…

      Also earlier stuff in the more immediate wake of the launch, which I’ll be talking about in my next article and so won’t mention here so as to preserve the suspense. ;)

      • Lubo

        April 11, 2015 at 10:26 pm

        Props for the references to software that changed “everything”. Especially Hypercard and Fool’s Errand!

        Several more to add:

        Quicktime 1.0 (circa 1991?) Before then it was audio OR video (.wav, .au, .snd OR mpg1) Not until DVDs and mpg2 did anyone seriously combine them. But in 1991, DVD was not mainstream, VHS and Beta were, with a smattering of the stillborn Laserdisc (the size of records!) Enter Quicktime. How many failed intel/microsoft video codecs were abandoned when windows quicktime appeared? A lot I think.

        Several awesome games: Silicon Beach Airborne, Dark Castle, Wizardry 1 (light-years ahead of the ugly PC ports, thanks to the Mac GUI), Archon (again GUI!) Firefox. Even ten years later when Civilization 1 was finally ported to the Mac, having been on the PC first, it looked better than the “refreshed and updated” PC port. Not until Civ 2 did they achieve anything close to parity.

    • Ian

      February 23, 2014 at 8:39 pm

      Raskin’s “original Macintosh” concept surfaced most fully-formed in the form of the little-known Canon Cat, although even that shipped with a 68000.

      He also was responsible for the SwyftCard, which you plugged into an Apple IIe and took it completely over, making it look and feel very much like a cut-down version of the Cat.

      • Keith Palmer

        February 25, 2014 at 2:58 am

        I did know about Jef Raskin creating the Swyftcard and Canon Cat, but not the specifics of how they worked. A trivial search just now, though, turned up the site, which has contemporary manuals and documents for both.

  2. ZUrlocker

    February 20, 2014 at 11:37 pm

    Great post. I bought a 512k “fat mac” as soon as they became available. But it was still limiting to have a single floppy drive. So I bought an internally installed GCC HyperDrive hard disk that gave me a whopping 10mb of storage. I thought I’d never fill that up! It was a great machine but programming for the Mac was not an easy thing, even with great tools like Lightspeed Pascal.

    BTW, you misspelled Alan Kay’s name with an “e” a couple of times.

    • Jimmy Maher

      February 21, 2014 at 6:37 am


  3. Scott

    February 21, 2014 at 1:18 am

    Loved this post – I’m just coming to the tail end of Jobs’ biography so to see these events in video rather than just text was great.

    And I don’t mind the Lemmings commercial, I thought it was quite clever :)

  4. Alex Smith

    February 21, 2014 at 4:12 am

    Great stuff as always. One minor point though: saying Raskin has never forgiven Jobs to this day implies he is still living. Sadly, Jef Raskin died in 2005.

    • Jimmy Maher

      February 21, 2014 at 6:32 am

      Thanks (you also, Keith). I missed the blinding fact that he’d died in the midst of all the more nitty-gritty stuff.

  5. chris

    February 21, 2014 at 4:31 am

    I’m a little skeptical of some of your background.

    He rejected an internal hard drive because it would require that most hated of all pieces of technology, a noisy fan.

    Do you recall what hard drives cost, back in 1982-1984? It would have driven the price through the roof. They were trying to keep it under $2000 (and they didn’t manage to succeed).

    He rejected a second internal floppy drive because there wouldn’t be room in Jerry Manock’s sleekly elegant case, plus bloat and all that.

    Cost is a likely issue here, too. These things are commodities now. Not then. This paragraph seems unfairly dismissive of what were probably more than aesthetic concerns.

    He tried to kill the Apple LaserWriter, a product that would prove almost as significant for the company as the Mac itself and without which the Mac may very well have not survived beyond its first couple of years

    Where did you find that information? I’ve always heard that he championed the LaserWriter. I’m willing to believe that I’ve been given revisionist history, but I’ve read all the books you cite as references, and more (some longer ago than others, so maybe I’ve forgotten?).

    • Jimmy Maher

      February 21, 2014 at 7:32 am

      Jobs’s initial opposition to the LaserWriter is mentioned on page 211 of Steven Levy’s Insanely Great, where it’s stated that he “adamantly opposed” the project at first but had come around by late 1983.

      Yes, there was no way Apple was going to be able to sell a $2500 Mac with a hard drive, although Apple’s insistence on high profit margins does also enter into play here. When books like Levy’s discuss how hard the Mac team worked to reduce costs, there’s always a trailing qualifier about Apple’s insistence on preserving the highest profit margins in the industry even on this supposedly “inexpensive” machine. I’m not making value judgments; Apple often put those profits to good use via innovative R&D no one else in the industry would have attempted. But it’s also important to be aware of in any discussion of early Mac pricing.

      Anyway, high profit margins or no, hard-drive prices were coming down quickly. Within months you could buy an external 10 MB hard drive for the Mac from third parties for $1500; an internal hard drive sourced in quantity by Apple and without the need for an external enclosure would have cost them substantially less, although certainly not enough less to get the price for the whole package anywhere near $2500.

      What I think Jobs’s team would have preferred was simply the *option* to configure a Mac with one, something Jobs vehemently opposed in his determination to make every Mac the same. This approach obviously proved impractical very quickly, as witnessed by the 512 K Mac which appeared within a year and the slots which soon appeared on the models released after Jobs’s departure. (Possibly I should have discussed this more thoroughly in the article.) On the whole it was a favoring of ideology at the expense of practicality, something young people are prone to — Jobs was not yet thirty — and Jobs much more so than most. I feel pretty confident in my assertion that it very nearly sunk the Mac.

      Those people who went out and bought the original Mac certainly had reason to be annoyed when Apple charged them $1000 to upgrade to the 512 K version; the only way to do it was to solder new memory chips onto the motherboards. (It soon emerged that the hardware itself required to do so couldn’t have cost more than $500 and probably a lot less than that to a company with Apple’s clout, leading to a lot of in my opinion justifiable criticism in the computer press. Surely they could have rewarded their earliest adopters by giving them the upgrade at cost…)

      • DZ-Jay

        February 25, 2017 at 12:39 pm

        May I suggest that the objections to your portrayal of Jobs in this case are because you provide little context for his decision to maintain the aesthetics. It has been said elsewhere that Jobs’s vision for the Macintosh was that of an appliance, like a toaster. A nice, slick, inviting toaster, that would do everything you expect a toaster to do for you, without any of the complexities of having to know how it works nor how it could be expanded into something else.

        Implied in this is the recognition that an appliance, say a toaster, does not come in a multitude of models with differing components. It toasts your bread and that’s that, avoiding any potential confusion for the consumer.

        In this light, his intent on maintaining the minimalist aesthetics and avoiding the bulky boxes of competitors, or the expandability of the Apple ][ seemed, if not justified, at least understandable. I personally do not fault him for this vision, even if it was misguided, and even if I disagree with his method of expressing it.

        Although perhaps tangentially implied by your account, none of this is expressly stated. It just comes out as Jobs did not want options because… Jobs. *shrug*

        He can be very petty in many regards, but this seems to distort him even more than necessary.


    • Ian

      February 23, 2014 at 8:43 pm

      Related: the lack of a fan, while a popular story, has been debunked as the cause of failure of the Apple ///. It was traced to an experimental PCB manufacturing process where adjacent traces would sometimes intermittently short out; revised boards made on the tried-and-true Apple II process worked just fine in the same cases with no fan.

      • Jimmy Maher

        February 24, 2014 at 8:44 am

        Can you point to a source on that? I’m always happy to revise and love to debunk myths, but I have seemingly very credible books on my bookshelf which describe the many of the problems as being down to inadequate cooling. Would just like to be certain of the alternative story before I revise…

        From Apple Confidential:

        “When the first volume shipments began in March 1981, it became apparent that dropping the clock chip was just a finger in the dike. Approximately 20 percent of all Apple IIIs were dead on arrival primarily because chips fell out of loose sockets during shipping. Those that did work initially often failed after minimal use thanks to Jobs’ insistence that the Apple III not have a fan (a design demand he would make again on the Mac). He reasoned that in addition to reducing radio-frequency interference emissions (a severe problem with the Apple II), the internal aluminum chassis would conduct heat and keep the delicate components cool. He was wrong.

        Compounding the problem was that Jobs dictated the size and shape of the case without concern for the demands of the electrical engineers, who were then forced to cram boards into small spaces with little or no ventilation. As the computer was used, its chips got hot, expanded slightly, and slowly worked their way out of their sockets, at which point the computer simply died. Apple’s solution was to recommend lifting the front of the computer six inches off the desktop, then letting it drop with the hope that the chips would reseat themselves!

        The problems with loose chips were exacerbated by short cables between internal components, non-gold connectors, and the circuit board manufacturer’s change in the flux washing process that led to latent corrosion.”

        The last part of this sounds like the issue to which you’re referring, but it also sounds like inadequate cooling was a major factor…

        • Anonymous

          February 25, 2014 at 2:06 am

          Well, in Steven Weyhrich’s recently (finally) published “Sophistication & Simplicity: The Life and Times of the Apple II Computer” (he’s the guy behind and has clearly done a lot of research) there is the following footnote on p. 195:

          An apocryphal story that I have for years included in this history on my web site was that a problem with the Apple III was heat production, which caused chips on the motherboard to come loose, requiring the computer be lifted a few inches and dropped to resolve. However, in an unpublished interview Mike Maginnis conducted with Wendel Sander in 2012, the actual problem was simply the memory board connector. Replacing this connector was the solution to the computer’s problems. However, picking up and dropping the computer probably also temporarily resolved the memory connector problem.

          • Jimmy Maher

            February 25, 2014 at 6:53 am

            Thanks! I don’t doubt Steven Weyhrich’s authority for a moment, but I still wish I had a stronger citation, especially as the claim of the failures being down to a “memory connector” would appear to be either different from or a simplification of Ian’s claim that they were down to a more general problem with the motherboard manufacturing process. For now anyway, I’ve edited the article to state definitively that this is all currently quite undefinitive.

          • Anonymous

            February 25, 2014 at 10:49 am

            The AppleLogic website also looks into the cause of the problems, and it looks as though they found it to be problems with the manufacturing process too.


            I don’t know; I’ve never used a III and only rarely seen them.

  6. Anonymous

    February 21, 2014 at 12:32 pm

    You’ve misspelled Jobs’ name as “Job’s” at least once. Also McIntosh isn’t a brand of apple, it’s a variety or a type. A brand of apple would be a name that a particular grower (or group) used for their specific crop of a variety with a more generic name, kind of like how there are Levi’s-brand blue jeans.

    Also, I wouldn’t say that the Mac was the first machine to use a 68000 chip — the Lisa is an obvious predecessor, and it had been around and in use for a few years. The first Sun workstations used it too. And draggable overlapping windows were first seen (at least first shipped) on the Xerox Star years before the Mac came out, IIRC. (I don’t remember if it had drop down menus, it tended to use buttons and dialogs a lot) People usually forget about the Star because it was not as legendary as the Alto and was rarely encountered unlike the Mac, but it did ship and was astoundingly advanced. There are some good videos of Star demos on YouTube which are worth taking a look at.

    Also you mention the dreaded disk swap tango. It was a bug (apparently a very long-lasting one) for which Steve Capps takes the blame, here:

    • Jimmy Maher

      February 21, 2014 at 12:56 pm


      I’ve definitely seen overlapping windows in the Star demonstrations I’ve seen, but never saw one actually being dragged. Nor does their look tend to imply a drag bar. Totally happy to revise that if you or anyone else can point to some definite evidence for windows as draggable entities on the Star, however. For obvious reasons, I’ve never actually used one. :)

      As far as 68000-based machines go, those you mentioned were all workstation-class machines costing $10,000 or more. The Mac, while not exactly cheap, was a consumer-grade computer marketed to individuals as well as businesses. That really puts it in a different class with different concerns and priorities and gives it an excellent claim to the title of first 68000-based *PC* in my opinion.

      • Anonymous

        February 22, 2014 at 2:43 pm

        Okay, I have been refreshing my memory by checking out a Star demo as I too have not had the good fortune to use one:

        Turns out the Star did have pull down menus (at about 10:00 and 16:30), and double clicking (at about 11:00) but I’ve yet to see draggable windows. I’ll keep an eye out for it.

        • Jimmy Maher

          February 22, 2014 at 9:55 pm

          Thanks for this. Steven Levy strongly implies in Insanely Great that credit for the double-click should go to Apple, something that’s obviously incorrect in light of the demo:

          “The word processing program at Xerox had used double-clicks to select words, but the Lisa group used that function for other things as well, particularly for launching applications, and it was further refined on Macintosh.”

          The “pull-down” menus in the demo to which you linked are a bit more of a borderline case in my opinion. They’re really pop-up menus stuck up in the corners of the windows, without the neat headings to which we’re accustomed today. Definitely not as refined; I’d call them a step on the road to what we saw with Macintosh, but not quite far enough along to receive full credit.

          There’s a tremendous amount of confusion on this topic of just who did what in inventing the modern GUI, which is strange when you consider how exhaustively the work at Xerox and Apple has been written about. There’s also a lot of agendas flying around, with Apple zealots eager to credit every possible innovation to Apple and Apple haters just as eager to pronounce the Mac nothing but a rip-off of the Alto and/or Star.

          I would love to sit down in a room with an Alto, a Star, a Lisa, and a first-generation Mac and sort all of this out once and for all. I’m not sure anyone has ever actually done that…

  7. IPadCary

    February 24, 2014 at 9:49 pm

    *rubs hands gleefully*
    NOW we get to the heart of the matter, because, after all,
    the Golden Age Of Gaming is, of course, the 16-bit era & its first computer is, of course, the Macintosh & its first game is, of course, “Alice”.
    For somebody who’s not at all an Apple guy, you sure did good on this article.

  8. Brandon Campbell

    November 11, 2014 at 1:44 am

    One of the earlier comments mentioned HyperCard… I’d love to see an article about that! I didn’t have a Mac when it came out, but remember reading about it in magazines and getting really excited about it, only to find it had fallen by the wayside by the time I finally got my first Mac (actually one of the short-lived Power Computing clones, which I’ll never forgive Steve Jobs for pulling the plug on a couple years later) in 1995.

    • Jimmy Maher

      November 11, 2014 at 7:04 am

      HyperCard is a very important step on the road to HTML and the World Wide Web. So, yes, it will get its due when the time comes…

  9. Jason Kankiewicz

    January 6, 2017 at 4:25 pm

    Shouldn’t the phrase “while he was it” be “while he was at it”?

    • Jimmy Maher

      January 6, 2017 at 4:49 pm


  10. DZ-Jay

    February 25, 2017 at 1:00 pm

    By the way, I’d like to thank you for giving Apple their due credit and for placing the Macintosh in its deserved place in the history of computing.

    The Macintosh did what not even the Xerox PARC team could do: convince the world that this new paradigm of user interaction with computers was the future, and that the future was NOW.

    These sorts of articles tend to skew one way or another due to the polarizing figure of Jobs and the apocryphal lore tied to the origins of the GUI. I applaud you for your efforts in offering a well researched, well reasoned, and well balanced account.


