


In the Macintosh software artists confronted that rarest of things, a completely new canvas. It wasn’t just a case of the Mac being better than the PCs that had come before; they’d had plenty of experience already dealing with that. No, the Mac was not so much better as fundamentally different. For all the possibilities opened up by the Mac’s mouse, its toolbox of GUI widgets accessible by any program, its crisp high-resolution screen, and its ability to make practical use of sampled sound recorded from the real world, there were also lots of caveats and restrictions. The black-and-white display and the lack of handy joysticks, not to mention the lack of obvious ways to get out of the windows-and-mouse paradigm, meant that many or most existing games would make little sense on the Mac. All Mac software, games included, would have to strike off in entirely new directions rather than building on the stuff that was already out there. That, of course, was very much how Steve Jobs and company had intended things to be on their paradigm-shifting creation. The original Mac team has mentioned almost to a person how excited they were at the launch to see what people would make with Macintosh, what they could do with this new set of tools. Game programmers were as eager as anyone to take up the challenge.

And some of them were to be found right there at Apple. Indeed, the Mac’s first great game far predates the launch. Like so much else on the Mac, it was born on the Lisa.

Through the Looking Glass, née Alice

At some point in the middle stages of the Lisa’s long gestation, a programmer specializing in printer interfacing named Steve Capps started tinkering in his spare time with Alice, a game inspired by the chess motif running through Lewis Carroll’s Through the Looking Glass. The player moved a piece representing Alice in real time around a chess board which was laid out in a striking 3D perspective, trying to stomp on all of the opposing pieces before they stomped on her. It was a simple concept, but, what with the player being subject to the normal movement rules of whatever chess piece she chose to play as in the beginning, one admitting of surprising depth. None other than the Lisa team’s head of systems programming, Bruce Daniels, introduced the Mac people to Alice. With the affable Daniels acting as intermediary, Capps soon received a Mac prototype along with the Mac team’s heartfelt request that he port Alice to it, a request to which he quickly acceded. It was a better fit for the Lisa’s more playful younger brother anyway, and, thanks to the Mac’s 3 extra MHz of clock speed, even ran more smoothly.

Alice became an obsession of the Mac team, with marketer Joanna Hoffman a particular devotee. She complained constantly that the game was too easy, prompting the obliging Capps to tweak it to increase the challenge. As Capps himself has since acknowledged, this probably wasn’t all to the good; the game that would eventually see commercial release is extremely challenging. Other suggestions, like the one from Steve Wozniak that the mouse cursor should shrink as it moved “deeper” into the board to emphasize the 3D perspective, were perhaps more productive. Steve Jobs showed little interest in the game itself (one of the many constants running through his career is an almost complete disinterest in games), but was very intrigued by the programming talent it demonstrated. Alice became Capps’s ticket to the Mac team in January of 1983, where he did stellar work on the Finder and other critical parts of the first version of MacOS.

As the big launch approached, Capps was understandably eager to explore the commercial potential of this game that had entranced so many of his colleagues. Trip Hawkins, who had continued to stay in touch with goings-on inside Apple even after resigning from the Lisa team, was sniffing around with proposals to release Alice under the Electronic Arts aegis, with whose accessible-but-arty early lineup it would have made an excellent fit. Steve Jobs, however, had other ideas. Feeling that the game should come out under Apple’s own imprint, he delivered a classically Jobsian carrot — that Apple would do an excellent job packaging and promoting the game — and stick — that, since Alice had been created by an Apple employee on the Apple campus using prototype Apple hardware and proprietary Apple software, it was far from clear that the game belonged to said employee in the first place, and legal trouble might just be the result if Capps decided to assume it did. And so Capps agreed to allow his game to become the first and only such that Apple themselves would ever release for the Mac.

The Through the Looking Glass package

The discovery of a database application already trading under the name of “Alice” necessitated a name change to the less satisfactory Through the Looking Glass. But otherwise Apple’s packaging of the game, made to look like an original edition of the novel that had inspired it — albeit one sporting a hidden Dead Kennedys logo, a tribute to Capps’s favorite band — was beautiful and perfect. EA couldn’t have done any better.

The marketing, though, was another story. Through the Looking Glass became a victim of Apple’s determination in the wake of the Lisa’s failure to reposition the Mac as their serious business computer, to shove the fun aspects of the machine under the carpet as something shameful and dangerous. Thus Capps’s game got nary a mention in Apple’s voluminous advertising that first year, and mostly languished as a dusty curiosity on dealers’ shelves. The game has gone on to become something of a cult classic as well as a treasured piece of Macintosh lore, but Trip Hawkins would doubtless have done a much better job of actually selling the thing.

Others also had good reason to be frustrated with Apple’s fear of fun. Infocom received a visit from Guy Kawasaki, today the most famous of all Apple’s early “Mac evangelists,” well before the Mac’s launch. In the words of Dan Horn, head of Infocom’s Micro Group, Kawasaki “begged” Infocom to get their games onto the Mac, and delivered several prototypes to make it happen. It turned out to be unexpectedly challenging. The pre-release version of MacOS that Infocom received with the prototypes was so buggy that they finally decided to throw it out altogether. They wrote their own simple window and menu manager instead, packaging it onto self-booting disks that dumped the player straight into the game. When the Mac debuted, Infocom’s catalog of ten games represented something like 50% of the machine’s extant software base. But by now the winds of change had blown at Apple, and Infocom couldn’t get Kawasaki or anyone else to even return their phone calls. No matter; Mac early adopters were a more accepting lot than much of Apple’s executive wing. Infocom did quite well on the Macintosh, especially in those earliest days when, Through the Looking Glass and a bare few others excepted, their games were the only ones in town.

Ultima III on the Mac

Still, Infocom was hardly the only gaming veteran to test the Macintosh waters. Sierra and Origin Systems demonstrated how pointless it could be to try to force old paradigms into new ones via their ports of, respectively, Ultima II and III to the Mac. The latter is a particular lowlight, with Ultima’s traditional alphabet soup of single-letter commands just jumbled into a couple of long menus helpfully labeled “A-M” and “N-Z.” Thankfully, most developers either did original work or took a lot more care to make their ports feel like native-born citizens of the Macintosh.

Sargon III on the Mac

Dan and Kathleen Spracklen, creators of the long-lived Sargon line of chess programs, ported the latest iteration Sargon III to the Mac complete with a new mouse-based interface and absolutely loads of learning aids and convenience features hanging from its menus. None other than Bill Atkinson, architect of QuickDraw and MacPaint, paused to note how the Mac version of Sargon III changed his very concept of what a chess program was, from an opponent to be cowed to something more positive and friendly, like the Mac itself.

I have to set Sargon III on the easy level. The challenge used to be seeing if the computer could beat you. The challenge now is for the computer to teach you, by leading you, giving you hints, letting you take back moves.

Bill Budge ported Pinball Construction Set, the program whose GUI interface presaged a largely Mac-inspired revolution in games when it appeared on the Apple II, to the Mac itself. As he himself noted, however, what was revolutionary on the Apple II was “just another program” on the Mac. Still, the Mac Pinball Construction Set did let you load your MacPaint pictures in as fodder for your custom pinball tables, a demonstration of one of the less immediately obvious parts of the new Mac Way: its emphasis on crafting applications that cooperate and complement rather than compete with one another.

Bill Atkinson’s MacPaint

Bill Budge’s MousePaint

Budge also went the other way, creating what amounted to an Apple II port of MacPaint called MousePaint that copied the original right down to the little Apple logo in the upper left of the menu bar. Packaged with Apple’s first mouse for the II line, MousePaint is one of the more obvious examples of the impact the Mac was already having on more modest platforms. (Budge also claimed to be working on a space simulation, but, like his vaunted Construction Set Construction Set and so much else during his years in the wilderness, it would never see the light of day.)

Much other early Mac entertainment also evinced the Pinball Construction Set approach of giving you ways to make your own fun, an ethos very much in keeping with that of the machine itself. MasterPieces, for instance, let you carve your MacPaint drawings up into jigsaw puzzles, while MacMatch let you use them to create matched-pair puzzles like the old game show Concentration. Still other programs weren’t technically games at all, but no less entertaining for it: things like Animation Toolkit; MusicWorks, which made the first spectacular use of the Mac’s four-voice sound capabilities; HumanForms, which let you make people, Mr. Potato Head-style, out of assorted body parts. Defender clones may have been in short supply on the Mac, but this heady, intellectual stripe of playfulness was everywhere by the time the machine entered its troubled second year. Thus Balance of Power felt like a perfect fit when it arrived that summer.

A magazine-published screenshot of the lost original Balance of Power

A creation of programmer, designer, writer, theorist, and industry gadfly Chris Crawford, Balance of Power is an ambitious geopolitical simulation of the contemporary world circa 1985. It places you in charge of either the United States or the Soviet Union, seeking to extend your sphere of influence over as many as possible of the sixty other countries in the game in a high-stakes game of Cold War brinksmanship. It’s a grandiose concept indeed, and becomes even more so when you consider the sheer amount of information Crawford has packed in — stuff such as number of physicians per million people, average daily caloric intake, and average school enrollment for each country. Not only would earlier machines have drowned under such a tsunami of data, but making it accessible and relatable would also have been nearly impossible. In Balance of Power, it’s all organized into neat menus and windows, as fine an example of the Mac’s ability to make information visually immediate and relevant as anything that came out those first couple of years. Before too long all grand strategy games would be like this.

Significant as it is as a waystation on the road to Civilization, Balance of Power is also a huge landmark of the serious-games movement. Simply put, this game has a rhetorical agenda. Boy, does it have an agenda. Pushing your opponent too far results in nuclear war, and the most famous piece of text Crawford has ever written.

You have ignited a nuclear war. And no, there is no animated display of a mushroom cloud with parts of bodies flying through the air. We do not reward failure.

It’s as powerful a statement now as then on not only the foolishness of jingoist brinksmanship but also on the seemingly perpetual adolescence of much of the mainstream games industry. Yet, and speaking here as someone who is quite sympathetic to Crawford’s agenda on both counts, it’s also kind of disingenuous and unfair and, well, just kind of cheap.

The problem here is that the game simply assumes bad faith on my part, that I’ve touched off a nuclear war so I can see body parts and mushroom clouds. In actuality, however, the body-parts-and-mushroom-clouds crowd is highly unlikely to have ever gotten this far with the cerebral exercise that is Balance of Power. It’s more likely that I’ve tried to play the game within the rules Crawford has given me and simply failed, simply pushed a bit too hard. It’s important to note here that playing within Crawford’s rules requires that I engage in brinksmanship; I can win only by pushing my luck, aggressively trying to spread my political agenda through as much of the world as possible at my fellow superpower’s expense so that I can end up with more “prestige points” than them. There is neither a reward nor any real mechanism for engendering détente and with it a safer world. Given that vacuum, I don’t really like being scolded for playing the game the only way that gives me any hope of success on the game’s own terms. To compound the problem, it’s often all but impossible to figure out how close your opponent actually is to the proverbial big red button, hard to know whether, say, Indonesia is really considered worth going to war over or not. Nuclear war, when it comes, can seem almost random, arising from a seemingly innocuous exchange after half a dozen computerized Cuban Missile Crises have passed harmlessly. There may arguably be a certain amount of rhetorical truth to that, but it hardly makes for a satisfying game. Perhaps more attention paid to presenting a real picture of the state of mind of your opponent and less to that mountain of 95% useless statistics could have helped — an ironic complaint to make about a game by Chris Crawford, coiner of the term “process intensity” and perpetual complainer about the prevalence of static data as opposed to interactive code in modern games.

I don’t want to belabor this too much more lest our real purpose here get entirely derailed, but will just note that Balance of Power falls into the trap of too many serious games to come as well as too many of Crawford’s own games in simply being not much fun to play. Crawford would doubtless simultaneously agree with and dismiss my complaints as a product of a body-parts-and-mushroom-clouds sensibility while noting that he aspires to something higher than mere fun. Which is fair enough, but I tend to feel that for a game to achieve any other rhetorical goal it must be engrossing in a way that Balance of Power just isn’t. Anyway, everything of ultimate note that it has to tell us about geopolitics is contained in the quote above. If, as in the movie WarGames, the only way to win is not to play, why charge people $50 for the non-experience? Suffice to say that, like plenty of other works I’ve written about on this blog, Balance of Power garners historical importance and even a certain nobility simply for existing when it did and trying the things it did.

I want to end this little survey today with a less rarefied game that’s of at least equal historical importance. It’s the product of a small Chicago-area company called ICOM Simulations which had already been kicking around the industry for a few years under the name of TMQ Software. Formed by Tod Zipnick in 1981, TMQ’s most ambitious pre-Mac product had been File-Fax, a database manager for the Apple II that garnered a positive review or two but few sales. Other than that, they’d mostly specialized in doing action-game ports to various platforms for the likes of Atarisoft, Coleco, and even EA. When the Mac arrived, they figured their odds of making a splash with original games in that new ecosystem were far better than they were on the congested likes of the Apple II.

Déjà Vu. Note the multiple layers of containment.

ICOM’s big idea was to translate the traditional parser-driven adventure game into the new visual paradigm of the Mac. The goal was essentially to do for the adventure what MacOS had done for the command-line-driven operating systems that preceded it, in pretty much exactly the same ways. The underlying world model of the MacVenture engine is that of a text adventure, divided into discrete interconnected rooms which can contain other objects with their own unique properties, including the one representing you the player. In a MacVenture, however, you interact with objects not by typing sentences but by constructing them visually, clicking one of eight verbs and an object to go with it — whether something in the room that you see represented graphically before you, something in your inventory (also represented as a set of draggable pictographs), an exit, or just your “Self.” You can add an indirect object by “OPERATING” one object (first click) on another (second click). You can pick up an object in the room just by dragging it to your inventory; drop it by dragging it back into the room. Objects can and often do contain other objects: you can “OPEN” the trench coat in your inventory to open a window showing you what’s in its pockets, “OPEN” the wallet you find there to open still another window with its contents, and so on down the hierarchy tree.
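
The nested-container world model described above maps naturally onto a few lines of code. Here’s a minimal Python sketch of the idea — purely illustrative, with hypothetical class and attribute names, not a reconstruction of ICOM’s actual engine:

```python
# Illustrative sketch of a MacVenture-style containment hierarchy.
# Class names and the "openable" flag are my own assumptions.

class Thing:
    def __init__(self, name, openable=False):
        self.name = name
        self.openable = openable
        self.contents = []          # objects can nest arbitrarily deep

    def add(self, item):
        self.contents.append(item)

    def open(self):
        # "OPEN" reveals a container's contents -- in the real engine,
        # each successful OPEN spawned a new window on screen
        if not self.openable:
            return f"The {self.name} won't open."
        return [item.name for item in self.contents]

# The trench-coat example from the text: coat -> wallet -> card
coat = Thing("trench coat", openable=True)
wallet = Thing("wallet", openable=True)
wallet.add(Thing("business card"))
coat.add(wallet)

print(coat.open())    # ['wallet']
print(wallet.open())  # ['business card']
```

Since every room, inventory, and container is the same kind of node in one tree, dragging an object between windows reduces to removing it from one contents list and appending it to another — which is why the interface could feel so uniform.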

In the fall of 1985, when the first MacVenture debuted in the form of a two-fisted private-eye caper called Déjà Vu, it was an absolute stunner, the sort of thing that could stop people in their tracks when they stumbled across it running on an in-store computer. And it’s still a fine interface, very intuitive and, a few quibbles about clutter resulting from the small screens of its era aside, very practical and enjoyable today.

It’s all too typical in the industry for a game with the shiny technical innovations of Déjà Vu to coast on them, for the actual design inside the engine to be little more than a tech demo. Nor is ICOM’s pedigree as a collection of hardcore programmer’s programmers all that comforting. I thus didn’t expect to think too much of Déjà Vu as a game when I played it for this article. I must say, though, that ICOM surprised me there.

Déjà Vu begins on December 7, 1941(!), when you wake up in a bathroom stall inside a deserted bar with no memory of who you are or how you got there or who or what you emptied three shots from your revolver into or why you seem to have a veritable cocktail of drugs flowing through your veins. Yes, amnesia is a cliché premise in adventure games, not least because it’s so damn convenient for a genre that’s really good at exploration and backstory but usually not so good at here-and-now plotting. Yet it can also be a compelling premise, in mystery fiction as well as games, and it works here. The mystery of who you are and how you got to that bathroom stall is intriguing, its unraveling compelling, with complications like the dead mobster that you soon also find in the bar (with three of your bullets in him, naturally) coming thick and fast. In contrast to so many games of its era, Déjà Vu is also pretty solvable. Oh, it’s very old school, with an unforgiving time limit — the drugs in your system will eventually kill you if you can’t find the antidote — and the occasional random death. You’ll need to save early and often and plan your forays carefully. Yet if you’re willing to do that you’ll find you can probably crack the case pretty much unassisted, and have a pretty good time doing it.

Déjà Vu doesn’t take itself all that seriously, but it doesn’t treat its whole premise as just a breeding ground for jokes either. As a relatively coherent work of fiction, it stands amongst the top tier of 1980s adventure games. The jokes that are there mostly fit the setting and are, shocker of shockers, genuinely funny as often as not. Much of the humor pokes fun at the protagonist, hardly unusual for early adventure games, but it doesn’t feel so personally insulting here because the game does a good enough job with characterization that you actually feel it to be sneering at the character you’re playing rather than you personally. About the only unfortunate aspect is an ugly series of juvenile jokes about an overweight woman, the sort of thing that can trigger a mild epiphany today about just how much certain social mores have changed — and, mind you, very much for the better — in the last thirty years.

Credit for Déjà Vu‘s surprisingly satisfying design largely goes to Craig Erickson. The story behind it was written by Kurt Nelson, Mark Waterman did the visuals, and Darin Adler, Steve Hays, and Todd Squires were the technical architects of the engine itself. Like Balance of Power, Déjà Vu was published by Mindscape, a company dating like EA from the big second wave of publishers and which, also like EA, was publishing some of the most interesting and audacious games in the industry during the mid-1980s. (That said, ICOM fell in with Mindscape largely out of convenience, because they were literally right down the road in an adjacent suburb of Chicago.) And also like Balance of Power, Déjà Vu was a hit by the modest standards of the early Macintosh software market, the big breakthrough that ICOM had been seeking for years. Tod Zipnick soon put his programmers to good use porting the MacVenture engine to other platforms, including not only the Mac’s mice-and-windows-and-68000-based competitors the Atari ST and Commodore Amiga but also the likes of the IBM PC, the Commodore 64, eventually even (in ports done by the Japanese company Kemco) the Nintendo Entertainment System — yet another sign of the importance of the Mac not just as a platform but as a concept and an engine of innovation.

ICOM has tended to be overlooked in histories of the graphic adventure, which mostly dwell on Sierra (whose King’s Quest debuted the year before Déjà Vu) and LucasArts (whose Maniac Mansion debuted two years after). In truth, however, the MacVenture engine is at least as important as Sierra’s AGI or LucasArts’s SCUMM engines. While King’s Quest is a deserved landmark simply for mixing interactive graphics with adventure at all, the AGI engine is also something of an evolutionary dead end with some fairly intractable problems, most notably that of trying to translate the objects you see graphically on the screen into words the parser will understand. LucasArts’s innovations, meanwhile, are more formal than technical, a declaration that it is possible to write challenging, enjoyable graphic adventures without random deaths, unforeseeable dead ends, and incomprehensible puzzles. The actual interface mechanics of the early LucasArts games are essentially a hybrid of AGI and MacVenture that is more playable than the former but not quite so slick as the latter. Déjà Vu gave its players in 1985 a preview of what virtually all commercial adventure games would be like in five or seven years. For a fan of prose and parsers like me and presumably many of you, that makes its debut something of a bittersweet moment, representing as it does one more huge nail in the slowly building coffin of the commercial text adventure. But such is progress.

Three more MacVenture games followed Déjà Vu, one of them a direct sequel. We’ll revisit ICOM at some future date to talk more about them, as we also will the ongoing cottage industry that was Mac software in general. In the meantime, you can play Déjà Vu and all of the other MacVentures online courtesy of Sean Kasun.

Their days may be numbered, but there’s still plenty to be written about the prose-and-parser people as well. We’ll take up that thread again next time, when we start to look at yet another of Infocom’s would-be challengers.

(Significant magazine sources: Electronic Games of March 1985; Byte of March 1986; Family Computing of April 1986. Jason Scott’s interviews with Steve Meretzky and Dan Horn for Get Lamp were invaluable as always; thanks, Jason! See a retrospective by Tom Chick for another take on Balance of Power. The picture that opens this article was taken from the March 1985 Electronic Games, who I wish had lasted longer; in addition to great art that I love to steal, the magazine had an unusually thoughtful editorial voice.)


Posted on February 28, 2014 in Digital Antiquaria, Interactive Fiction




The Apple Macintosh followed a long and winding road to join Steve Jobs onstage in front of a cheering throng at De Anza College’s Flint Auditorium on January 24, 1984. It was never even a particular priority of its parent company until, all other options being exhausted, it suddenly had to be. But once it finally was let out of its bag it became, just as its father predicted, the computer that changed everything.

Jobs wasn’t even the first father the Mac knew. It had originally been conceived almost five years earlier by another dreamer, digital utopianist, and early Apple employee named Jef Raskin who believed he could save the world — or at least make it a better place — if he could just build the Dynabook.

The brainchild of still another dreamer and visionary named Alan Kay, who first began to write and speak of it in the very early days of Xerox PARC, the Dynabook was more thought experiment than realistic proposal — a conception, an aspirational vision of what could one day be. Kay called it “a dynamic medium for creative thought”:

Imagine having your own self-contained knowledge manipulator in a portable package the size and shape of an ordinary notebook. Suppose it had enough power to outrace your senses of sight and hearing, enough capacity to store for later retrieval thousands of page-equivalents of reference materials, poems, letters, recipes, records, drawings, animations, musical scores, waveforms, dynamic simulations, and anything else you would like to remember and change.

The Dynabook was a tall order in light of the realities of 1970s computer technology. Indeed, nothing that came remotely close would actually appear for another two decades at least. As Kay himself once put it, thinkers generally fall into two categories: the da Vincis who sketch away like mad and spin out a dozen impractical ideas before breakfast upon which later generations can build careers and obsessions; and the Michelangelos who tackle huge but ultimately practical projects and get them done. Kay was a da Vinci to the bone. The PARC researchers dubbed the Alto — the less fanciful workstation they built as their primary engine of innovation for the time being — the “interim Dynabook.”

Michael Scott, Steve Jobs, Jef Raskin, Chris Espinosa, and Steve Wozniak circa 1977

Much later in the decade, Raskin thought he might advance the cause a bit more with an interim Dynabook of his own. He thought even the much-loved Apple II was too complicated, too difficult and fiddly, too aesthetically unpleasant, too big to ever play an important role in anyone’s life who was more interested in what she could do with a computer than the computer as an end in itself. He therefore pitched to the executives at Apple his idea for a relatively cheap (about $1000) and portable computer that, far from being the hardware hacker’s playground that was the Apple II, would be a sealed, finished piece — the only one you had to buy to start expressing yourself digitally. Even all the software you’d need would come built right in. Believing that the standard industry practice of naming prototypes after women (as often as not the prettiest secretary in the office) was sexist, he decided to call his idea Macintosh, after his favorite type of (edible) apples, the McIntosh.

In many ways Raskin’s idea cut directly against the grain of Apple’s corporate strategy, which was to further penetrate the business market, in the short term via the Apple III and in the long via the Lisa; both projects were already underway, although the latter was in nothing like the form it would eventually assume. While Apple was trying to trade in their bellbottoms for three-piece suits, Raskin was still living the hippie dream of bringing power to the people. “If I wanted to work for a business company, I’d join IBM,” he told Apple’s president Mike Scott. Still, the company was booming and an IPO was already visible on the horizon. There was enough money and enough hippie utopianism still hanging about the place to let Raskin and a few others tinker with his project.

The Macintosh project during its first eighteen months rarely had a staff of more than four, and often less than that; Raskin had to fight for scraps. Sometimes that worked out for the best; a key acquisition was Burrell Smith, a talented hardware engineer he rescued from a job as a lowly service technician, testing and repairing Apple IIs that had come back to the company under warranty. Smith became the Mac’s hardware guru, a position he would continue to hold right up through the launch and some time beyond, giving him by far the longest tenure of any member of the original team. Given Raskin’s price window, Smith couldn’t afford to design anything that would be much more powerful than the Apple II; the first prototype was built around an 8-bit Motorola 6809 no more powerful than the Apple II’s 6502, and had just 64 K of memory. It did, however, use a relatively high-resolution bitmapped display in lieu of the Apple II’s text. Although he was oddly unenamored with mice and windows, this part at least of the Xerox PARC gospel had reached Raskin loud and clear.

With Raskin himself often not seeming sure what he wanted and what was doable and many of his staff not seeming overly interested in buckling down to work on his schemes, the project languished through most of 1980. On one or two occasions it was actually cancelled, only to be revived in response to Raskin’s impassioned pleas. Yet practical progress was hard to see. Raskin mostly busied himself with The Book of Macintosh, a sort of aspirational bible hardly more practical than Kay’s original dream of the Dynabook. Then Steve Jobs read The Book of Macintosh and promptly came in and took his computer away from him.

Jobs was a huge headache for Michael Scott, Mike Markkula, and the rest of Apple’s senior leadership, who received memos almost daily complaining about his temper, his dismissive attitude toward the Apple II platform that was the only thing supporting the company, and his refusal to listen to reason when one of his sacred precepts was threatened. Jobs’s headstrong authoritarianism had been a big contributor to the debacle that was the Apple III launch. (Traditional wisdom, as well as an earlier version of this article, would have it that Jobs’s insistence that the Apple III ship without a cooling fan led directly to the hardware problems that left Apple IIIs dying on buyers’ desks by the thousands. It does, however, appear that this version of events is at least questionable; see the comments section for more about that. Be that as it may, everyone involved would agree that Jobs did an already muddled project no favors.) The Apple III never recovered, and would pass into history as Apple’s first flop. Now he was sowing the same chaos within the Lisa project, a computer the company simply couldn’t afford to let go the same way as the Apple III. Scott and Markkula forcibly removed him from Lisa in late 1980. They would have liked for him to just content himself with enjoying his post-IPO millions and accepting the occasional medal at the White House as a symbol of the American entrepreneurial spirit while they got on with actually running his company for him. They would have liked, in other words, for Jobs to be like Wozniak, who dipped in and out of the occasional engineering project but mostly was happy to spend his time organizing rock festivals and finishing his education and learning to fly an airplane and generally having all the good times he’d missed during a youth spent with his head buried in circuit boards. Jobs, alas, was not so pliable. He wanted an active role at what was after all still in some moral sense his company.

Trouble was, every time he took an active role in anything at all anger and failure followed. Thus his forcible eviction from Lisa while it still looked salvageable. But at the same time Apple certainly couldn’t afford an ugly break with their founder and entrepreneurial golden boy. When a hurt Jobs started to lick his wounds from Lisa not through ugly public recriminations but by interesting himself in Raskin’s strictly small-time Macintosh project, the executives therefore took it as very good news. Let him tinker and meddle to his heart’s content with that little vanity project.

But Jobs’s interest was very bad news for one Jef Raskin. Never really technical himself, Jobs nevertheless knew very well how technical people thought. He innocently suggested to Burrell Smith that he might dump the plebeian old Motorola 6809 in favor of the sexy new 68000 that the Lisa people were using, and double the Mac’s memory to 128 K while he was at it. That was an offer no hardware hacker could resist. With Smith successfully subverted, it was just a matter of time. Raskin wrote furious memos to upper management about Jobs’s unauthorized takeover of his project, but they fell on predictably deaf ears. Instead, in early 1981 the takeover was made official. Jobs condescendingly offered Raskin the opportunity to stay with the Macintosh in the role of technical writer. Raskin, who by all indications had an ego almost as big as Jobs’s own, refused indignantly and walked out. He never forgave Jobs for co-opting his vision and stealing his project, remaining convinced until his death in 2005 that his Macintosh would have been better for Apple and better for the world than Jobs’s.

For all that the project had been in existence for over eighteen months already, there was very little really to Macintosh at the time of the takeover — just Raskin’s voluminous writings and some crude hardware based on an obsolete chip that resoundingly failed to live up to the visions expressed in The Book of Macintosh. Thus one could say that the real story of the Macintosh, the story of the machine that Jobs would finally unveil in January of 1984, begins here. Which is not to say that Jobs discarded Raskin’s vision entirely; he had after all been originally drawn to the project by the ideas inside The Book of Macintosh. Although the $1000 goal would be quietly dropped in fairly short order, the new machine should nevertheless be inexpensive at least in comparison to the Lisa, should stress elegance and simplicity and the needs of everyday non-computer people above all else. Jobs, however, shared none of Raskin’s skepticism about mice and menus. He had bought the GUI religion hook, line, and sinker, and intended the graphical user interface to be every bit as integral to the Macintosh as it was to the Lisa. Hell, if he could find a way to make it more so he’d do that too.

Much of the original Mac team: Bill Atkinson, Andy Hertzfeld, Chris Espinosa, George Crow, Joanna Hoffman, Burrell Smith, and Jerry Manock. Taking a leaf from Electronic Arts’s playbook, Apple photographed them often in artful poses like this one during the Mac’s initial promotional push.

Still with pull within Apple the likes of which Raskin could only dream of, Jobs began assembling a group of stars to start over and make his version of Macintosh. Joining Smith the hardware guru were additional hardware engineer George Crow; programmers Andy Hertzfeld, Larry Kenyon, Chris Espinosa, Bruce Horn, Steve Capps, Bud Tribble, and Bill Atkinson; industrial designer Jerry Manock to shape the external look and feel of the machine; Susan Kare to shape the internal look and feel as designer of graphics, icons, and fonts; and Joanna Hoffman as writer, marketer, and the team’s face to the outside world, the first “Mac evangelist.” Jobs even briefly recruited Wozniak, but the latter found it hard to stay focused on the Mac, as he would just about every other project after his Apple II masterpieces, and soon wandered off again. Others would come and go, but the names listed above were the core of the team that would, just as Jobs so often promised them was inevitable, change the world.

Jobs deliberately fostered an “us against the world” mentality, with the world in this case apparently including the rest of Apple — particularly the much larger and more bureaucratic Lisa team. His dictum that “It’s more fun to be a pirate than to join the Navy” shaped the Mac team’s conception of itself as a brilliant little band of rebels out to make a better world for everyone. They even took to flying a skull-and-crossbones flag outside their offices on the Apple campus. They were united by a sincere belief that the work they were doing mattered. “We all felt as though we had missed the civil-rights movement,” said one later. “We had missed Vietnam. What we had was Macintosh.” Their pranks and adventures have become computer-industry folklore (literally; Andy Hertzfeld’s longstanding website is full of them, and makes great reading).

Of course, one person’s genius at work is another’s self-entitled jerk. A joke was soon making the rounds at Apple:

How many Macintosh Division employees do you need to change a light bulb?

One. He holds the bulb up and lets the universe revolve around him.

Perhaps the people with the most justification for feeling aggrieved were those poor plodding pedants — in Jobs’s view, anyway — of the Lisa team. As Steve Capps would later put it, “A lot of people think we ripped off Xerox. But really we ripped off Lisa.”

To say that the Mac could not have existed without Lisa is in no way an overstatement. Mac was quite literally built on Lisa; for a long time the only way to program it was via one of the prototype Lisas installed in the team’s office. The Mac people watched everything the Lisa people did carefully, then reaped the fruit of whatever labor seemed useful to them. They happily digested the conclusions of the Lisa team’s exhaustive user testing of various designs and interfaces and built them into the Mac. They took Bill Atkinson’s QuickDraw, the core rendering layer at the base of the Lisa’s bitmapped display, for the Mac. Later, Jobs managed to take its programmer as well; in addition to QuickDraw, Atkinson became the author of the MacPaint application. Yes, Jobs proved surprisingly willing to borrow from the work of a team he dismissed as unimaginative plodders. The brilliance of the people involved is one answer to the question of how Macintosh was created by so few. Lisa, however, is another.

The Mac people regarded their leader with a combination of awe and bemused tolerance. It was team member Bud Tribble who coined perhaps the most famous of all descriptions for Jobs’s unique charisma, that of the “reality distortion field.” “In his presence,” noted Tribble, “reality is malleable. He can convince anyone of practically anything.” Tribble elaborated further on Jobs’s unique style:

Just because he tells you that something is awful or great, it doesn’t necessarily mean he’ll feel that way tomorrow. You have to low-pass filter his input. And then, he’s really funny about ideas. If you tell him a new idea, he’ll usually tell you that he thinks it’s stupid. But then, if he actually likes it, exactly one week later he’ll come back to you and propose your idea to you, as if he thought of it.

The aforementioned reality distortion field kept this sort of behavior from seeming as obnoxious as it would have from just about anyone else. Anyway, everyone was well aware that it was only because of Jobs’s patronage that the Mac project was tolerated at all at Apple. This little group of pirates, convinced that what they were doing was indeed (to choose another of Jobs’s catchphrases) “insanely great,” something that would change the world, knew that they owed the vision and the opportunity for Macintosh to Jobs. Atkinson later noted that “You only get one chance to change the world. Nothing else matters as much — you’ll have another chance to have vacations, have kids.” Most people, of course, don’t ever even get one chance. He and the rest of them owed theirs to Jobs.

Thankful as they were, they were hardly mindless disciples. They did their best to redirect his course when he got details as wrong as he got the big-picture vision right. When their reasoning failed, as it usually did with the imperious Jobs, they did their best to subvert him and/or to minimize the damage.

The list of bad decisions Jobs made about Macintosh is long, easily long enough to torpedo virtually any other computer. He insisted that the Mac use the same horrifically unreliable in-house-designed “Twiggy” disk drives as the Lisa, an example of borrowing a bit too much from Mac’s older sister. He rejected categorically pleas that the Mac at least have the option of memory expansion beyond 128 K, insisting that doing so would just encourage programming inefficiency and turn the Macintosh into a bloated monster like Lisa; his team’s arguments that a bitmapped, GUI-driven operating system running under a 16-bit processor required by its very nature vastly more memory than something like the Apple II got them nowhere. He rejected an internal hard drive because it would require that most hated of all pieces of technology, a noisy fan. He rejected a second internal floppy drive because there wouldn’t be room in Jerry Manock’s sleekly elegant case, plus bloat and all that. He tried to kill the Apple LaserWriter, a product that would prove almost as significant for the company as the Mac itself and without which the Mac may very well have not survived beyond its first couple of years. He cut short all discussion of networking by pulling out a floppy disk and pronouncing, “Here’s your network!” (The laser printer and Ethernet, those two other parts of the PARC gospel, had most resoundingly not reached Jobs during his famous visit.) He even refused to permit cursor keys on the keyboard, saying that the mouse was the only proper way to move the cursor in this new paradigm of computing.

The original Mac keyboard, complete with no cursor keys

People did what they could in the face of this litany. Burrell Smith made sure the Mac was capable of accommodating 3.5-inch floppy drives, the emerging industry standard soon to replace the older 5.25-inch floppies, as well as the Twiggy. When Lisa debuted a year ahead of the Mac and the Twiggy drives proved a disaster, the Mac manufacturing team was able to slot the 3.5-inch drives easily into their place. (Taking the fall for Twiggy was another great service Lisa did Macintosh.) Everyone also made sure that the Mac was ready to accommodate more memory on both the hardware and software side, for when the realization finally dawned that 128 K just wasn’t going to cut it. (That realization began to dawn quite early even for Jobs; the machine he unveiled to press and public on January 24, 1984, had in fact been hacked to have 512 K. Otherwise the presentation would have been a less impressive one altogether, with a lot more time spent waiting for the Mac to deign to do something and none of the cool synthesized speech.) For most of the rest, there wasn’t much for it but to hope the machine did well enough with the early adopters that they could go back and fix the problems later. Cooler heads in management did at least prevail to save the LaserWriter.

On the hardware side, the Macintosh was smartly but minimalistically designed by Burrell Smith, a huge admirer of Steve Wozniak who strained to craft the same sort of elegant circuitry for the Mac that Woz had for the Apple II. For all that it was clean and compact, however, the Mac wasn’t terribly interesting or impressive as a piece of hardware. Jobs, from a contemporary interview in Byte magazine:

By paying a little more for the microprocessor, not only were we able to give the customer an infinitely more powerful chip than, say, an 8-bit chip or one of Intel’s baby micros, but we were able to pick up this amazing software [referring here to Bill Atkinson’s QuickDraw layer sourced from the Lisa project], and that allowed us to throw tons of chips out of this thing. We didn’t have to get special custom text or graphics chips. We just simplified the system down to where it’s just a bitmap on the screen, just Bill’s amazing software and Burrell’s amazing hardware, then in between that the other amazing software that we have. We tried to do this in every single way, with the disk and with the I/O…

The Macintosh, in other words, asks a heck of a lot of its 68000 CPU, something it could get away with because, well, it was a 68000, the most powerful reasonably priced chip in the industry at the time. A person reading that Byte interview might have asked what the 68000 could do with a bit more support in the hardware around it. That question would be answered in fairly resounding form by later 68000-based machines, most notably the Amiga, which could run rings around the Mac.

But of course that line of argument is a touch unfair, given that the Mac was the first PC in the world to be principally defined not by its hardware but by its software. The newly minted MacOS was a brilliant creation, one that went in many ways far beyond what its legendary predecessors at Xerox PARC had managed. Incredible as the Xerox Alto was, much of what we have come to expect in our GUIs as a matter of course dates not from the Xerox of the 1970s but from the Apple of the early 1980s. Amongst these are such basic building blocks as pull-down menus and even the idea of windows as draggable entities that can overlap and be stacked atop one another; on the Alto they were non-overlapping tiles fixed in place (as they also were, incidentally, in the earliest versions of Microsoft Windows). One of Jobs’s favorite aphorisms during the final frantic year of Mac development was “Real Artists Ship!” This was something the tinkerers and theorists at PARC never quite managed to do. As anyone who’s ever finished a big creative project knows, the work of polishing and perfecting usually absorbs far more time and effort — and tedious, difficult effort at that — than hammering out the rough concept ever does. Apple did this heavy lifting, thus enshrining Xerox PARC as well as the Mac itself forever in computing legend. And they did it well — phenomenally well. I have my problems with Apple then and now, but this should never be forgotten.

As the Mac began to assume concrete form at the beginning of 1983, Jobs’s star at Apple was again in the ascendant. After years of muddled leadership from Michael Scott and Mike Markkula, the company had finally decided that a more dynamic leader was needed. Scott and Markkula had been Silicon Valley insiders steeped in engineering detail; Markkula had personally contributed code, testing, and documentation to the company’s early projects. To bring to fruition Jobs’s vision for Apple as a great mainstream company, known and loved by the masses, a very different sort of leader would be needed. Ideally, of course, that leader would be him, but Apple’s board wasn’t that crazy. As a second-best alternative, Jobs became enamored with a very unconventional choice indeed: a marketing expert and polished East Coast blue blood who was currently running the Pepsi brand. His name was John Sculley, and it was doubtful whether he even knew how to turn on one of Apple’s computers.

Steve Jobs and John Sculley at the Mac’s public introduction on January 24, 1984.

Even had he never hooked up with Apple, Sculley’s name would be enshrined in business lore and MBA syllabi. Not yet 45 when Jobs’s courtship began, Sculley was already a decorated general of the Cola Wars. He had been one of the pioneers of what would come to be called “lifestyle advertising.” You know the sort of thing: all those advertisements that show cool, pretty people doing interesting things whilst listening to the hippest music and, oh, yes, just happening to enjoy a Pepsi while they’re about it. (“Come alive — you’re in the Pepsi Generation!”) “Boy,” thinks the consumer, “I’d like to be like those people.” And next time she’s at the grocery store, she picks up a six-pack of Pepsi. It sounds absurd, but, as one look at your television screen will tell you, it’s very, very effective. Very few of us are immune; I must sheepishly admit that I once bought a Volkswagen thanks largely to a certain advertisement featuring a certain Nick Drake song. As Mad Men has since taught all of us and Sculley grasped decades ago, the cleverest advertising doesn’t sell us a product; it sells us possibility. The best examples of the lifestyle form, like that Volkswagen spot, can be compelling and inspired and even beautiful.

If that wasn’t enough, Sculley was later instrumental to the most legendary Cola Wars campaign of all time, the Pepsi Challenge, which cleverly combined the lifestyle approach with the more conventional hard sell. The advertisements showed that it just happened to be the cool, attractive people — many of them hip young celebrities and athletes — who preferred the taste of Pepsi to that of Coke. The ads were everywhere, an inescapable part of the cultural landscape of the late 1970s and early 1980s. And, judging by the relative sales trends of Coke and Pepsi, they were very, very effective; for the root cause of the “New Coke” fiasco of the mid-1980s, look no further.

Now Jobs wanted Sculley to do the same thing for Apple, to craft for the company an identity that transcended the specifications sheets and price comparisons that sullied their competitors. To some extent Apple already enjoyed a special status; their compelling origin story and the charisma of their two young founders along with the engaging personality of their signature creation the Apple II gave them a cachet of which drabber, more conventional companies, executives, and computers could only dream. Now Jobs believed he and Sculley together could leverage that image to make an Apple computer the hippest lifestyle accessory of the 1980s. There was more than a little bit of utopian fervor to Jobs’s vision, part and parcel of that strange intermingling of hardheaded business ambition and counterculture idealism that has always seen Jobs and the company he founded selling a better world for a rather steep price. Jobs’s deal-closing pitch to Sculley, which may never have actually passed his lips in such pithy form, has nevertheless gone down into Apple lore: “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” How could anyone refuse?

It became increasingly clear as 1983 wore on and Sculley settled into his new West Coast digs that the specific Apple computer that would be doing the world-changing must be the Macintosh. The Lisa was a flop, done in by intrinsic failings, like the unreliable Twiggy drives and its beautiful but molasses-slow GUI, and some extrinsic ones, like its high price and the uncertainty of big business — the only people who could realistically buy the thing — over what it really was good for. Nor did Jobs’s persistent whispers to reporters to just wait, that something cheaper and even better was coming soon, do the Lisa any favors.

Still, by many measures the Mac was not only cheaper but better than Lisa. Its 68000 architecture may have been unexceptional, but so was the Lisa’s — and the Mac’s 68000 was clocked at 8 MHz, a full 3 MHz faster than the Lisa’s. The Mac’s operating system was slim and lightweight, written in pure 68000 assembly language, as opposed to the Lisa’s bigger and more ambitious (overambitious?) operating system which was mostly written in Pascal. There was a price to be paid for the Mac’s slim efficiency; in some areas like multitasking and memory protection MacOS wouldn’t fully equal LisaOS until the arrival of OS X in 2001. But an average user just trying to get stuff done will make lots of compromises to have a snappy, usable interface — something which, at least in contrast to the Lisa, the Mac had in spades.

Dismissed for years as a backwater project with little relevance to Apple’s business-centric corporate direction, Jobs and his band of pirates now found themselves taking center stage as Macintosh geared up for the big launch. Macintosh was now the future of Apple; Macintosh simply had to succeed. The last five years at Apple had been marked by the ever-greater success of the Apple II almost in spite of its parent company and two colossal and expensive failures to develop a viable successor to that beloved platform. Apple was still a major force in the PC industry, with yearly revenues approaching $1 billion. Yet they were also in a desperately precarious position, dependent as they still were on the archaic Apple II technology and their absurdly high profit margins on same. At some point people had to stop buying the Apple II, which was now thoroughly outclassed in some areas (notably graphics and sound) by competition like the Commodore 64 that cost a fraction of the price. With the Apple III and the Lisa lost causes, the Macintosh by default represented Apple’s last chance to field a viable bridge to the Apple II-less future that had to come one of these days. Given the age of the Apple II, it was highly doubtful whether they would have time to go back to the drawing board and create yet another new machine for yet another kick at the can. The Mac represented their third strike; it was Mac or bust. Steve Jobs and his team reveled in it and prepared to change the world.

The Macintosh was announced to the world on January 22, 1984. Early in the third quarter of Super Bowl XVIII and not long after one of IBM’s Charlie Chaplin spots for the ill-fated PCjr, an audience bloated with chips and beer and bored with a rather lackluster football game saw this, the most famous Super Bowl advertisement of all time.

Most people had no idea whatsoever what Apple was on about, had no idea that Big Brother represented the hated IBM who had taken the lead in business computing that Apple felt was rightfully theirs. The commercial was the talk of the media for the next few days, as everyone speculated about just what this “Macintosh” thing was and what it had to do with hammer-hurling freedom fighters. The advertisement, which it soon emerged had been directed by none other than that master of dystopia Ridley Scott of Alien and Blade Runner fame, would never darken a television screen again. No need; it had done its job, and would go down into history alongside Lyndon Johnson’s “Daisy” ad as one of the two most famous one-and-done commercials of all time.

The “1984” spot was an overheated, rather adolescent piece of rhetoric, coming off almost like a caricature of Apple’s exaggerated self-importance. It was by no means beloved by everyone even within Apple. The Mac’s moving up to become the company’s biggest priority hadn’t changed the determination of most of their executive wing to make it not as a maker of home and hobbyist computers, a competitor to Commodore and Atari and Radio Shack, but as a player in the much more lucrative field of business computing, where IBM (and, increasingly, IBM clones, a story for another time) ruled. Meanwhile Jobs still saw the Macintosh as he always had, as a way of changing not just the business world but the world full stop — which didn’t quite mean that he wanted to get down in the trenches with the likes of Commodore either, mind you, but also funneled his ambitions for the platform in a very different direction. Caught somewhere in the middle was John Sculley, a man who had been brought in thanks to his prowess as a consumer marketer but was nevertheless beholden to both factions. The constant push and pull between them, and the mixed messaging that resulted, would very nearly sink the Mac. Just before the Mac’s introduction, the business faction pushed through a rise in the list price from $2000 to a more businesslike $2500. But then came the “1984” commercial, whose lurid tone was all but guaranteed to repulse exactly the corporate leaders the business interests wanted to attract; these folks identified more with Big Brother than with the hammer-wielding freedom fighter. It would go on like that for a long time.

At the official launch on January 24, Jobs publicly committed Apple to the goal of selling 50,000 Macs in the first hundred days. It was dangerously ambitious; to miss the goal would be embarrassing and momentum-killing. In the end they managed it and then some; sales reached some 70,000, and they might have sold even more if not for teething problems at the factory typical of a new computer. Virtually all of the machines they sold, however, went not to corporations but to forward-thinking individuals of a certain technological bent and disposable income who rightly recognized in the Mac a new future paradigm. Douglas Adams, who saw his first Mac in Infocom’s offices and promptly fell in love, was archetypical of the demographic.

All of which was fine as far as it went — Apple was happy to sell to individuals too if they had the money to buy — but didn’t do a lot to further the dream of the Mac as a rival to the IBM PC on the desks of corporate America. Equally frustrating was much of the software that appeared that first year, which often tended toward games and other frivolous stuff frowned upon by corporations. By year’s end the pool of early adopters with disposable income was already looking exhausted and corporations still weren’t buying. The result was tens of thousands of Macs piling up in warehouses and cancelled production orders. Total sales for the year amounted to 250,000, about half of Jobs’s projections at launch time. And sales were getting worse every month, not better. It was beginning to look disconcertingly like Strike 3 — Apple III and Lisa all over again. The only thing keeping the company in the black was still the inexplicably evergreen Apple II, which in 1984, that supposed Year of the Macintosh, enjoyed its best sales yet. Revenue from the Apple II amounted to 2.5 times that from the Mac. Apple II loyalists, who despite Apple’s official claims of “Apple II Forever!” could see where the company’s real priorities lay, took no small delight in this reality.

Joanna Hoffman, the marketer who was with the Mac project almost from the beginning, frankly admitted later that the sales results were, at least in retrospect, unsurprising.

It’s a miracle that it sold anything at all. This was a computer with a single disk drive, no memory capacity, and almost no applications. People who bought it did so on seduction. It was not a rational buy. It was astonishing that Macintosh sold as many as it did.

Or, as Douglas Adams put it:

What I (and I think everybody else who bought the machine in the early days) fell in love with was not the machine itself, which was ridiculously slow and underpowered, but a romantic idea of the machine. And that romantic idea had to sustain me through the realities of actually working on the 128 K Mac.

Those realities could be hellish. The single floppy drive combined with the inadequate memory could make the original Mac as excruciating to actually use as it was fun to wax poetic about, with the process of just copying a single disk requiring more than fifty disk swaps and twenty minutes. MacWrite, the Mac’s flagship version of that bedrock of business applications the word processor, was so starved for memory that you could only create a document of about eight pages. Determined Mac zealots swapped tips on how to chain files together to craft their Great American Novels, while the business world just shrugged and turned back to their ugly but functional WordStar screens. The Mac was a toy, at best an interesting curiosity; IBM was still the choice for real work.

"Test Drive" ad campaign

Sculley did his best to apply his Pepsi marketing genius to the Mac, but found it tough sledding. That Christmas Apple began the “Test Drive a Macintosh” campaign, which — shades of the Pepsi Challenge — let prospective buyers take a machine home for free to play with for 24 hours. Some 200,000 did so, but very few actually bought afterward, leaving stores with nothing but a bunch of used Macs to show for their trouble. For the 1985 Super Bowl, Apple attempted to recapture some of the Mac’s launch buzz with another high-concept commercial, this one depicting IBM users as mindless lemmings trudging off the side of a cliff. Ridley Scott’s brother Tony did the directing honors this time between pre-production work on Top Gun. But by now it all just felt kind of trite and childish, not to mention insulting to the very businesspeople Apple was trying to win over. Reaction from corporate America was so negative that Apple briefly considered taking out a full-page apology in the Wall Street Journal.

Apple’s summer of discontent, the rock-bottom point for the Mac, came in 1985. Not only were Mac sales still moribund, but by then another terrifying reality was becoming clear: Apple II sales were also slowing. The previous year had at last been the top of the bell curve. The day they had dreaded loomed, the day when they would have no viable next-generation machine and no faithful Apple II to fall back on. Apple closed three of their six factories and laid off 20 percent of their workforce, some 1450 people, that bleak summer.

Shortly after, Steve Jobs finally walked away from Apple following an acrimonious split with his erstwhile best mate John Sculley and a clumsy failed coup in the Apple boardroom. Jobs had proved psychologically incapable of accepting or addressing the Mac’s failings as both a piece of computer hardware and as a marketplace proposition. Jay Elliott, Apple’s head of human resources, summed up his situation beautifully:

[Jobs] could see that horizon out there, a thousand miles out. But he could never see the details of each little mile that had to be covered to get there. That was his genius and his downfall.

The Macintosh, like Apple itself, needed a practical repairman in 1985, not a bold visionary. This was a role Jobs was, at least at this phase of his life, eminently unqualified to play. And so he had made life intolerable for everyone, until the ugly public split that several generations of previous Apple management had only just found ways to avoid had come at last. The famed Apple mojo seemed all but gone, lost along with their charismatic founder.

But, as happens often (if not quite often enough) in business as in life, that summer proved to be the darkness before the dawn. Apple’s engineers had not been idle while the Mac struggled through its difficult first year, but had rather set doggedly to work to correct the worst of its failings. An external floppy drive became available a few months after launch, greatly alleviating the hell of disk swapping. The so-called “Fat Mac” with 512 K of memory, the amount most of the development team not named Jobs had agreed was appropriate from the start, appeared late in 1984. A hard disk and even cursor keys — their lack had been one of the more loathed aspects of the original machine if also a boon for makers of add-on keypads — were in the offing, as was, slowly and painfully, a workable networking system. The loss of Jobs only made such alleged dilutions of his vision easier to accomplish. The buggy original systems software was slowly tweaked and upgraded, while a third-party software ecosystem steadily grew on the backs of enthusiastic early adopters with money to spend. It didn’t come as quickly as Apple would have liked, and much of it wasn’t initially as businesslike as they might have liked, but the software — and with it a burgeoning community of famously loyal users — did come. Indeed, it was a third-party developer who arguably saved the Macintosh in tandem with another product of Apple’s busy engineering staff.

Paul Brainerd was a techie with a background in publishing who had for some time dreamed of finding a way to revolutionize the complicated and expensive process of traditional typesetting — pasteboards, huge industrial printers, and all the rest — through microcomputer technology. He had been stymied by two sore lacks: a computer with a high-resolution graphics display capable of showing what a document would look like on the printed page, pictures and all; and a printer capable of producing said document on paper. When he saw the Mac for the first time, he recognized that one of these needs had been met at last. When he reached out to Apple, they let him in on a secret: they had a solution for the other in the works as well, in the form of the LaserWriter, an affordable — in publishing terms; it would cost about $7000 — laser printer. The combination of the Mac, the LaserWriter, and the software Brainerd would eventually produce to make use of them, Aldus PageMaker, would invent the field of desktop publishing and change everything for the Mac and for Apple.

Like so much else about the Mac, it wasn’t an entirely original concept. Way back around 1975, Ginn & Co., a textbook publisher and Xerox subsidiary out of Boston, were gifted by the researchers at PARC with some Altos and a custom interface to hook them up to a big Dover laser printer. Ginn became the first all-digital publisher in the world. “Initially the reaction to the concept was, ‘You’re going to have to drag me kicking and screaming,'” said Tim Mott, one of the PARC people chiefly responsible for the project. “But everyone who sat in front of that system and used it, to a person, was a convert within an hour.” It was in fact Ginn’s editors who coined the ubiquitous terms “cut” and “paste,” a reference to the old manual process of cutting out manuscripts and photographs and pasting them onto pasteboard for typesetting. Now, a decade later, the rest of the world would finally get the opportunity to follow Ginn’s lead. The Mac had its killer app for business at last.

In retrospect it should have been obvious. It had been obvious to Xerox, hardly a company revered for vision; their big attempt to package PARC’s innovations into commercial form had come with the Xerox Star, a “document-processing workstation” that was essentially a sneak preview of desktop publishing before the term existed. But Apple, and especially Jobs, had been so focused on the Macintosh as a revolutionary force of nature in all aspects of the human condition that they’d had trouble thinking in terms of the concrete, practical applications that made corporations buy computers.

Publishers loved PageMaker. It turned what had been an all-night, all-hands-on-deck process for countless small periodicals and corporate publishing departments, a hot, dirty nightmare of paste and print and paper, into something almost painless, something downright fun. Apple came to call PageMaker and its competitors, which were soon springing up like toadstools after a rain, their Trojan Horses. A brave purchasing manager would buy a couple of Macs and a LaserWriter as an experiment, and six months later the same company would be coming back for fifty or a hundred more. Publishing would become the first of several creative niche industries that the Mac would absolutely own, even as IBM continued to dominate the mainstream of business. It wasn’t quite the grand head-to-head challenge that Jobs had dreamed of, but, combined with sales of the Apple II that would remain on the descendant but surprisingly strong for the rest of the decade, it was a pretty good living.

Apple had been very, very lucky; they and the Mac had blundered through somehow. David Bunnell, longtime publisher of MacWorld magazine, summarized the Mac’s formative years bluntly:

To hold up the Macintosh experience as an example of how to create a great product, launch an industry, or spark a revolution is a cruel joke. Anyone who models their business startup on the Macintosh startup is doomed to failure. Miracles like the Macintosh can only happen once.

If the bargain with practicality represented by the Macintosh as desktop-publishing specialist seems disheartening, consider how genuinely empowering just this application was to countless people. For it wasn’t just big or medium-sized companies who bought Macs for this purpose. Especially as the prices of software and hardware came down, the small printers, the neighborhood associations, the church groups could also get in on the act. It’s astonishing how ugly the average fanzine or newsletter of 1980 is compared to that of 1995. The difference is almost entirely down to the Macintosh, which let people get their messages out there in a form of which no one need be embarrassed. Many, like a young man named Eliot Cohen who used his Mac to start a popular newsletter focusing on his great obsession, New York Mets baseball, and soon found himself in the locker room interviewing his heroes as the slick magazines called to beg for his insights, credited the Mac with literally changing their lives. This democratizing of the means of production is one of the most inspiring outcomes of the PC revolution and, much as I’m ambivalent about some aspects of the platform and its parent company, of the Mac itself. Indeed, I have a special reason for giving credit where it’s due: the logical successors to the Mac-enabled fanzines that were everywhere by the early 1990s are blogs like this one. We’re still riding that same continuum of change.

Consider also how immense was the Mac’s soft power. People — even people who rejected the Mac itself as an overpriced boondoggle — somehow recognized that this was the way computers really ought to work. It became an ideal, striven for if seldom reached for years. No matter; other computers were better for the striving. Even machines like the lowly Commodore 64 soon housed their own valiant attempts at replicating MacOS. To really get the scope of the changes wrought by the Mac, one need only compare the average Commodore 64 or Apple II game of, say, 1983 and 1986. A friendly GUI interface, of the sort which felt revolutionary when it appeared in the landmark Pinball Construction Set in 1983, was practically the baseline norm by 1986. The hardware hadn’t changed a whit; the vision of what could be done with it had. So, the Macintosh really did end up changing the world. Steve Jobs, wrong about so many nitpicky things, was breathtakingly right about that.

(The Macintosh story has been told so often and by so many that the biggest problem in writing an article like this one is sorting through it all and trying to inject some grounding into the more evangelistic accounts. My primary book sources were Insanely Great by Steven Levy; West of Eden by Frank Rose; Apple Confidential by Owen Linzmayer; and Dealers of Lightning by Michael A. Hiltzik. Andy Hertzfeld’s folklore.org is also a goldmine. The Byte quote given above is from the February 1984 issue, part of a series of features greeting the Mac’s arrival. Various episodes of Computer Chronicles, archived by the dedicated folks at archive.org, also informed the article. See in particular “Mainframes to Minis to Micros”; “Integrated Software”; “Printers”; “Computer Ergonomics”; “The Macintosh Computer”; “Computer Graphics”; “Slowdown in the Silicon Valley” Parts One and Two; “Printers and Business Graphics”; and “Desktop Publishing” Parts One and Two. The photos sprinkled through the article are from Apple Confidential, except for the picture of the original Mac keyboard, which was taken from the aforementioned issue of Byte.)


Posted on February 20, 2014 in Digital Antiquaria, Interactive Fiction



QL Pawn

Clive Sinclair and Anita Sinclair


Flop though it was in the big picture, the Sinclair QL still managed to attract some tens of thousands of loyal users during its brief commercial lifespan, some of whom still persist with the flawed but oddly endearing little machine to this day. The user base was big enough to support a small commercial software market, with some games on offer. Because the supposedly business-focused QL wasn’t up to all that much graphically, a good proportion of the games were text adventures. Among this group was by far the most remembered piece of software born on the QL — indeed, for a certain group of enthusiasts the only reason the QL is remembered at all. I’m referring of course to The Pawn, the first game from the last of the great 1980s text-adventure houses to emerge, Magnetic Scrolls. Like so many other stories I’ve already told, theirs begins with a few young friends with a shared love of computers and Dungeons and Dragons.

Ken Gordon, Hugh Steers, and Rob Steggles all grew up together in the London suburb of Woolwich. Gordon and Steers were both dedicated hackers. Steggles was less technical, the “tag-along” member of the trio. He did, however, possess another talent which made Gordon and Steers value his company: he had a knack for devising clever scenarios for D&D, which he employed to good effect as Dungeon Master of their little role-playing group. All three were also avid players of adventure games on the computer. They played all the usual suspects in the Britain of the early 1980s, but, crucially, also had exposure to the works of Infocom thanks to the exotic and expensive Apple II the Gordon family owned in preference to the more typical Speccy or BBC Micro. The lives and works of these young men would be shadowed and to some extent guided by the work of that other great adventure-game house across the ocean for years to come. Gordon and Steers spent much time analyzing the Infocom games and tinkering with adventure-game engines and parsers, dreaming of their own games that could live up to the standards of Infocom.

Their hobby would almost certainly have come to nothing more had Gordon not had another friend just a couple of years older, an ambitious and driven young woman named Anita Sinclair who possessed technical talents of her own to go with charm and a flair for business and marketing. Although she was no relation to Sir Clive, she nevertheless had the perfect name to work at his company. That’s exactly what she did for a time, becoming Sir Clive’s personal aide and one of his firm favorites in the process; their friendship would persist for years, outliving the heydays of both Sinclair Research and Magnetic Scrolls.

Ken, Hugh, and Anita all met in her flat on the evening before Ken and Hugh were to take their A-Levels. There they talked code and business and Hugh demonstrated a parser he had written. The three decided to go into business together writing games for a new platform from which Anita’s keenly honed nose detected the smell of opportunity: the Sinclair QL.

The QL may have been conceived as Sinclair Research’s “professional” machine, but even businesspeople would presumably want to have fun sometimes. Anita initially wanted to make arcade games; they could be finished and released relatively cheaply and quickly. Ken and Hugh, however, had grander ambitions: to form a company and develop the technology to make adventure games that could compete head to head with Infocom. Hugely popular though they were, British adventure games had so far been constrained by the more limited British hardware: 48 K or even as little as 32 K of memory rather than the more customary 64 K of North America, and no disk drive to use for virtual memory. Companies like Level 9 had pushed that hardware to incredible places, but there were inevitable limits. The QL, though, came with a full 128 K, a figure which coincidentally was also the maximum size of a game written using Infocom’s Z-Machine virtual machine. Thus the trio could be the first developers in Britain to meet Infocom on a level playing field.

In time Ken and Hugh infected Anita with their bug as well. She thus proposed to Sir Clive that she and her friends deliver a platform-exclusive next-generation adventure game for the QL, to be published by Sinclair themselves to give the game an official blessing of sorts. He agreed. Financing the venture largely through a certain amount of familial wealth that Anita had at her disposal, the trio of friends started Magnetic Scrolls — wonderful name, isn’t it? — in a tiny office in the district of Eltham, southeastern London, in the spring of 1984.

In their technical approach to game-writing as in so much else, they were determined to follow Infocom’s lead as much as possible. In lieu of Infocom’s famous DEC minicomputer, they built a network of four Apple IIs linked to a central hard drive. They used this to write in a subset of 68000 assembly language, which they named ELTHAM: “Extra Low-Tech Highly Ambiguous Metacode,” thus showing if nothing else that they had a sense of humor about the whole endeavor. A cross-compiler translated the ELTHAM code into a form suitable for running on the QL. Over long hours in the cramped office they applied all of the lessons they had learned as hobbyists and Infocom fanatics to an adventure-game engine with a complicated world model and a full-sentence parser the likes of which had never yet been seen from a British developer.

Still, a great adventure is not the result of its technical underpinnings alone. They also needed a game to put in there. Gordon and Steers thought of their old Dungeon Master Rob Steggles, who had just come back home for his first summer break from university (non-technical to the end, he was studying philosophy). Steggles:

They called me up and asked me to write a scenario to run on the system before I went off to university in the autumn. The scenario was The Pawn. What we did with The Pawn was get together as a group and talk for hours until we came up with a whole bunch of disparate, bizarre ideas. We wrote them all on several A1 sheets, put them on the wall, and looked at them for a while. Then I sat down and wrote a story around them.

Steggles then returned to university, where he “thought no more about it.” The others, however, went to work translating his notes into a game. What they had originally conceived as a six-month project turned into a long, hard slog for this group of clever but largely self-taught hackers venturing far into unfamiliar territory without the benefit of Infocom’s grounding in computer science from MIT. They continued to spend at least as much time on their recalcitrant toolchain as on the actual game. Anita Sinclair (translated from the German original in Happy Computer magazine by me):

When The Pawn was more or less finished, we began debugging — looking for problems. To help us with that we had another program, a debugger, which worked similarly to a machine-language monitor. Our debugger told us that we had a failure in such and such a place. But no matter how much we pored over the code we couldn’t find the mistake. At last somebody figured out that the debugger itself had a bug.

It took more than a year after Steggles’s involvement ended for the final game to appear under Sinclair’s trademark gray livery in late 1985.

The Pawn, QL version

The Pawn on the QL


The Pawn is sadly not a game that stands up terribly well almost thirty years later, nor one for which I can muster much affection even after applying the requisite gauzy haze of nostalgia. It received its name late in development from a friend who played it named Tony Lambert, who noted that the player is kept in a constant state of bewilderment and is constantly being used and manipulated by those around him, like a high-fantasy version of The Band’s “The Weight.” And, indeed, The Pawn is a nasty, nihilistic little game with a mean streak unusually wide even for its era. This is a game which delights in breaking certain objects — like, inexplicably, a wooden bowl — when you DROP them without bothering to tell you about it until you try to use them dozens or hundreds of moves later; a game which expects you to commit cold-blooded murder at one point; a game which can’t even be bothered to render up a proper victory screen at the end, but instead lets you spend eternity wandering its landscape as a helplessly homeless ghost. It also has literally some of the worst puzzles I’ve ever seen — and, as you’ve probably gathered by now, I’ve seen a lot of puzzles.

A classic face-palm, almost so bad it’s good, comes in a secret room you discover inside a tree.

>enter door
Tree Trunk Room
You are in a small, cramped room inside the tree trunk. The floor is covered with varnished wooden planks and light filters through the various knots and holes further up the trunk.

There are some boards here. One of the floorboards seems quite loose.

>examine floorboards
The boards are large and very solid.

>examine loose floorboard
I don't understand loose

>lift floorboard
The boards are too heavy to lift.

>get floorboards
The boards are too heavy to lift.

>pull floorboard
You cannot get a grip on the boards

>touch floorboard
The boards are slippery

>slide floorboard
The boards are too heavy to lift.

>pry floorboard with hoe
I don't understand pry

>lever floorboard with hoe
Nothing happens.

>pull floorboard with hoe
You cannot get a grip on the boards

>examine door
The door is heavily camouflaged to make it look like part of the tree.

It is open

>look under door
You find nothing of interest

>close door
The door is now closed.

>lift floorboard
A dark forbidding opening is revealed leading down...

This sequence could almost stand in for the rest of this review, as it so nicely illustrates everything that’s wrong with The Pawn: the parser that’s superficially smart but doesn’t understand basic synonyms like “PRY” (or the more British “PRISE”) for “LEVER”; the world model that doesn’t distinguish between the loose floorboard and the rest of them; the refusal to just tell you what the hell you’re seeing so you can apply some sort of human logic to solving the puzzles. What’s apparently really going on above, if you haven’t gathered it already (and no shame in that), is that the open door was covering the loose board, preventing you from raising it. I must say “apparently,” of course, because the game refuses to tell you a word about something that would be completely obvious to you if you were actually standing there — this even though I’m desperately trying to give it the benefit of the doubt, looking hard at the door for some obscure message. This and some of The Pawn’s other puzzles deserve to join the worst of Roberta Williams deep, deep down in adventuring hell.

I do want to say a little bit more about the parser because in its own way it’s as wrong-headed as many of the puzzles. Magnetic Scrolls loved to trumpet its ability to handle thorny constructions like “USE THE TROWEL TO PLANT THE POT PLANT IN THE PLANT POT”; in fact, I suspect that the very reason those items exist in the game is to show off the parser. And it worked — reviewers would dutifully type in such sequences when prompted by their press materials and write the requisite amazed copy. The Pawn’s parser is, however, an absurdly brittle creation. If you know the exact formulations it expects, you can indeed use it in impressive ways. You can, for instance, “TIE THE HOE AND THE RAKE TOGETHER WITH THE TEE-SHIRT.” (This is another puzzle solution that makes no sense to anyone who’s ever seen the real-world analogues of The Pawn’s hoe, rake, and tee-shirt, but never mind, I’m complaining about something else now.) But what if you try to “TIE SHIRT TO HOE AND RAKE”? Not only does the parser not understand — it misunderstands, leaving you with the shirt tied to the hoe and the last of the command blissfully ignored. Players don’t need a parser that gets all dressed up in its little sailor suit to do fancy tricks for the journalists, then collapses into a tantrum when asked to do a practical job. No, they need one that understands a wide variety of simple, commonsense constructions, and one that when it doesn’t understand doesn’t pretend it does but rather — and this is critical — gives good feedback on exactly what confused it so you can try again forewarned and forearmed. For all its fun with potted plants, The Pawn’s parser still has a long way to go to match Infocom’s. It can at times be even more challenging to get sense out of than a two-word job because of the combinatorial explosion inherent in finding the exact, idiosyncratic phraseology it expects.
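To make the feedback principle concrete, here is a minimal sketch in Python. This is emphatically not Magnetic Scrolls’ actual code; the vocabulary and function names are my own invention, and a real parser does far more than scan for unknown words. But it shows the behavior Infocom got right and The Pawn got wrong: when the program hits a word it doesn’t know, it names that word rather than guessing or silently dropping the rest of the command.

```python
# Toy illustration of a parser that reports exactly what confused it.
# The vocabulary and phrasing here are invented for the example.
VOCABULARY = {
    "tie", "the", "shirt", "to", "hoe", "and", "rake",
    "lift", "floorboard", "with", "together",
}

def parse(command: str) -> str:
    """Echo a fully understood command, or name the first unknown word."""
    words = command.lower().split()
    for position, word in enumerate(words, start=1):
        if word not in VOCABULARY:
            # Good feedback: tell the player which word failed, and where,
            # instead of acting on a half-understood command.
            return f'I don\'t know the word "{word}" (word {position} of your command).'
    return "Understood: " + " ".join(words)

print(parse("tie shirt to hoe and rake"))
print(parse("pry floorboard with hoe"))
```

The first command is echoed back in full; the second produces a message naming “pry” as the stumbling block, at which point a player would naturally reach for a synonym. A parser that instead executed the fragment it did understand, as The Pawn does, leaves the player unaware that anything went wrong at all.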

Formally, The Pawn is much more in the mold of Zork than any of Infocom’s later titles: a diverse, nonlinear grab-bag of disparate gags and puzzles and geography held together by little in the way of plot coherence. Some of the humor is kind of dumb but amusing; you play a stoner with a “green design of a plant that has seven jagged-edged leaves” on your tee-shirt, and further references to your love for pot abound (see the “pot plant”) without the game ever just coming out and saying it. (In retrospect I’m surprised that the guardians of adolescent virtue never caught onto this and condemned the game. Perhaps — incredibly! — it was too subtle…) Much of it, though, is just dumb full stop, such as the horse you meet who talks like Mister Ed. (“Get on up. Heyyy, you’re good. D’you ever star in a cowboy movie?”) And the less said about the cameo by Jerry Lee Lewis of all people the better…

As far as I can make out the game wants to be a cutting satire of adventure-game clichés, but its most amusing riffs are themselves lifted from other, smarter commentaries, such as the adventurer from Enchanter (“A tall, handsome man dressed in gleaming armour. He occasionally says things like ‘go north’, ‘get sword,’ and ‘unlight lamp.'”) and the meta-revelations sprinkled throughout the game that it really does all take place inside a computer, a trope that dates back at least to Microsoft’s 1979 version of the Crowther and Woods Adventure. The more original stabs at satire, such as the useless and inescapable maze with a sign outside labeling it “totally irrelevant to the adventure,” cut like a butter knife.

Maybe the kindest thing I can say about The Pawn is that it gave Magnetic Scrolls a suitably low baseline upon which to improve markedly and rapidly in games to come. That they had the opportunity to make more games is largely down to the talent shown by the core trio, particularly Anita Sinclair, for sensing which way the winds were blowing and planning accordingly. As work on the QL version of The Pawn proceeded, it was becoming more and more obvious that that platform was not likely to recover from its disastrous launch sufficiently to support a dedicated adventure-game developer. Luckily, new 68000-based machines were appearing in the form of the Atari ST and the Commodore Amiga; both offered more memory, spectacular graphics capabilities, and much better prospects for commercial success. By the time The Pawn appeared on the QL to predictably underwhelming sales, Anita and company had already initiated the process of porting the game to other machines, a task greatly eased by the development system they had spent so much time building. Soon gamers would look at the company’s early days as a QL-exclusive developer as little more than prehistory, a footnote to the real story.

That’s a story that I’ll be continuing in the course of time. But now it’s time to leave British shores and return Stateside, where an elephant in the room in the form of a certain paradigm-shifting platform introduction has been awaiting its due patiently for a long time now.

(The definitive source for information about Magnetic Scrolls on the Web is the Magnetic Scrolls Memorial, full of contemporary articles and even some memories from Rob Steggles. You can also find the game files themselves and interpreters to run them there. Francesco Cordella’s L’avventura è l’avventura hosts an excellent interview with Steggles. Useful magazine sources: Commodore User of December 1986; Happy Computer of January 1987; ZX Computing of August 1986; Popular Computing Weekly of October 10 1985; Computer and Video Games of October 1987; Home Computing Weekly of May 3 1983; Your Computer of January 1988. The picture at the beginning of this article is from the August 1986 ZX Computing.)


Posted on February 11, 2014 in Digital Antiquaria, Interactive Fiction



This Tormented Business, Part 2

In December of 1984 Sir Clive Sinclair and Chris Curry, heads of those leading lights of the British PC revolution Sinclair Research and Acorn Computers respectively, gave a Daily Mirror columnist named Michael Jeacock a Christmas gift for the ages. Like Jeacock, Sinclair and Curry were having a drink — separately — with colleagues in the Baron of Beef pub, a popular watering hole for the hackers and engineers employed in Cambridge’s “Silicon Fen.” Spotting his rival across the room, Sinclair marched up to him and started to give him a piece of his mind. It seemed he was very unhappy about a recent series of Acorn advertisements which accused Sinclair computers of shoddy workmanship and poor reliability. To make sure Curry fully understood his position, he emphasized his words with repeated whacks about the head and shoulders with a rolled-up newspaper. Curry took understandable exception, and a certain amount of pushing and shoving ensued, although no actual punches were thrown. The conflict apparently broke out again later that evening at Shades, a quieter wine bar to which the two had adjourned to patch up their differences — unsuccessfully by all indications.

If you know anything about Fleet Street, you know how they reacted to a goldmine like this. Jeacock’s relatively staid account which greeted readers who opened the Christmas Eve edition of the Daily Mirror was only the beginning. Soon the tabloids were buzzing gleefully over what quickly became a full-blown “punch-up.” Some wrote in a fever of indignation over such undignified antics; Sinclair had just been knighted, for God’s sake. Others wrote in a different sort of fever: another Daily Mirror columnist, Jean Rook, wrote that she found Sinclair’s aggression sexually exciting.

It would be a few more months before the British public would begin to understand the real reason these middle-aged boffins had acted such fools. Still heralded publicly as the standard bearers of the new British economy, they were coming to the private realization that things had all gone inexplicably, horribly wrong for their companies. Both were staring down a veritable abyss, with no idea how to pull up or leap over. They were getting desperate — and desperation makes people behave in undignified and, well, desperate ways. They couldn’t even blame their situations on fate and misfortune, even if 1984 had been a year of inevitable changes and shakeouts which had left the software industry confused by its contradictory signs and portents and seen the end or the beginning of the end of weak sisters on the hardware side like Dragon, Camputers, and Oric. No, their situations were directly attributable to decisions they had personally made over the last eighteen months. Each made many of these decisions against his better judgment in the hope of one-upping his rival. Indeed, the corporate rivalry that led them to a public bar fight — and the far worse indignities still to come — has a Shakespearian dimension, being bound up in the relationship between these two once and future friends, each rampantly egotistical and deeply insecure in equal measure and each coveting what the other had. Rarely does business get so personal.

Acorn’s flagship computer, the BBC Micro, is amusingly described by Francis Spufford in Backroom Boys as the Volvo of early British computers: safe, absurdly well-engineered and well-built, expensive, and just a little bit boring. Acorn had taken full advantage of the BBC’s institutional blessing to sell the machine in huge quantities to another set of institutions, the British school system; by the mid-1980s some 90% of British schools had BBC Micros on the premises. Those sales, combined with others to small businesses and to well-heeled families looking for a stolid, professional-quality machine for the back office — i.e., the sorts of families likely to have a Volvo in the driveway as well — were more than enough to make a booming business of Acorn.

Yet when the person on the street thought about computers, it wasn’t Curry’s name or even Acorn’s that popped first to mind. No, it was the avuncular boffin Uncle Clive and his cheap and cheerful Spectrum. It was Sinclair who was knighted for his “service to British industry”; Sinclair who was sought out for endless radio, television, and print interviews to pontificate on the state of the nation. Even more cuttingly, it was the Spectrum that a generation of young Britons came to love — a generation that dutifully pecked out their assignments on the BBC Micros at their schools and then rushed home to gather around their Speccys and have some fun. Chris Curry wanted some of their love as well.

Acorn Electron

Enter in 1983 the Acorn Electron, a radically cost-reduced version of the BBC Micro designed to take on the Spectrum on its own turf. Enthusiasm for the Electron amongst the rank and file at Acorn was questionable at best. Most were not afflicted with Curry’s need to show up his old boss, but rather manifested a strain of stuffy Cambridge elitism that would cling to Acorn throughout its history. They held Sinclair’s cheap machines and the games they played in a certain contempt. They were happy to cede that segment to him, and would rather be working on innovative new technology — Acorn had already initiated a 32-bit RISC processor project that would eventually result in the ubiquitous ARM architecture that dominates smartphones and tablets today — than repackaging old technology for mewling schoolchildren. Curry had to struggle mightily to push the Electron project through in the face of such indifference.

A price of £200, about half that of the BBC Micro, would get buyers the same 32 K of memory and the same excellent BASIC, albeit in a smaller, less professional case. However, the Electron’s overall performance was sharply curtailed by an inefficient (but cheaper) new memory configuration. The Electron’s sound capabilities also suffered greatly by comparison with its big brother, and the BBC Micro’s Mode 7, a text-only display mode that programmers loved because it greatly reduced the amount of precious memory that needed to be allocated to the display, was eliminated entirely. And, much cheaper than the BBC Micro though it may have been, it was still more expensive than the Spectrum. On paper it would seem quite a dubious proposition. Still, a considerable number of punters went for it that Christmas of 1983, the very peak of the British micro boom. Many were perhaps made willing to part with a bit more cash by the Electron’s solidity and obviously superior build quality in comparison to the Speccy.

But now Curry found himself in a truly heartbreaking position for any captain of industry: he couldn’t meet the demand. Now that it was done, many months behind schedule, problems with suppliers and processes which no one had bothered to address during development meant that Electrons trickled rather than poured into stores. “We’re having to disappoint customers,” announced a spokeswoman for W.H. Smith. “We are not able to supply demand. What we have has sold out, and while we are expecting more deliveries the amount will still be well below demand.” By some estimates, Acorn missed out on as many as 100,000 Electron sales that Christmas. Worse, most of those in W.H. Smith and other shops who found the Electrons sold out presumably shrugged and walked away with a Spectrum or a Commodore 64 instead — mustn’t disappoint the children who expected to find a shiny new computer under the tree.

Never again was the lesson that Curry took away from the episode. Whatever else happened, he was damn sure going to have enough Electrons to feed demand next Christmas. Already in June of 1984 Curry had Acorn start placing huge orders with suppliers and subcontractors. He filled his warehouses with the things, then waited for the big Christmas orders to start. This time he was going to make a killing and give old Clive a run for his money.

The orders never came. The home-computer market had indeed peaked the previous Christmas. While lots of Spectrums were sold that Christmas of 1984 in absolute numbers, it wasn’t a patch on the year before. And with the Spectrum more entrenched than ever as the biggest gaming platform in Britain, and the Commodore 64 as the second biggest, people just weren’t much interested in the Electron anymore. Six months into the following year Acorn’s warehouses still contained at least 70,000 completed Electrons along with components for many more. “The popular games-playing market has become a very uncomfortable place to be. Price competition will be horrific. It is not a market we want to be in for very long,” said Curry. The problem was, he was in it, up to his eyebrows, and he had no idea how to get out.

Taking perhaps too much to heart Margaret Thatcher’s rhetoric about her country’s young microcomputer industry as a path to a new Pax Britannica, Curry had also recently made another awful strategic decision: to push the BBC Micro into the United States. Acorn spent hugely to set up a North American subsidiary and fund an advertising blitz. They succeeded only in learning that there was no place for them in America. The Apple II had long since owned American schools, the Commodore 64 dominated gaming, and IBM PCs and compatibles ruled the world of business computing. And the boom days of home computing were already over in North America just as in Britain; the industry there was undergoing a dramatic slowdown and shakeout of its own. What could an odd British import with poor hardware distribution and poorer software distribution do in the face of all that? The answer was of course absolutely nothing. Acorn walked away humbled and with £10 to £12 million in losses to show for their American adventure.

To add to the misery, domestic sales of the BBC Micro, Acorn’s bread and butter, also began to collapse as 1984 turned into 1985. Preoccupied with long-term projects like the RISC chip as well as short-term stopgaps like the Electron, Acorn had neglected the BBC Micro for far too long. Incredibly, the machine still shipped with just 32 K of memory three years after a much cheaper Spectrum model had debuted with 48 K. This was disastrous from a marketing standpoint. Salespeople on the high streets had long since realized that memory size was the one specification that virtually every customer could understand, and that customers used this figure along with price as their main points of comparison. (It was no accident that Commodore’s early advertising campaign for the 64 in the United States pounded relentlessly and apparently effectively on “64 K” and “$600” to the exclusion of everything else.) The BBC Micro didn’t fare very well by either metric. Meanwhile the institutional education market had just about reached complete saturation. When you already own 90% of a market, there’s not much more to be done there unless you come up with something new to sell them — something Acorn didn’t have.

How was Acorn to survive? The City couldn’t answer that question, and the share price therefore plunged from a high of 193p to as low as 23p before the Stock Exchange mercifully suspended trading. A savior appeared just in time in the form of the Turin, Italy-based firm Olivetti, a long-established maker of typewriters, calculators, and other business equipment, including recently PCs. Olivetti initially purchased a 49 percent stake in Acorn. When that plus the release of a stopgap 64 K version of the BBC Micro failed to stop the bleeding — shares cratered to as low as 9p and trading had to be suspended again — Olivetti stepped in again to up their stake to 80 percent and take the company fully under their wing. Acorn would survive in the form of an Olivetti subsidiary to eventually change the world with the ARM architecture, but the old dream for Acorn as a proudly and independently British exporter and popularizer of computing was dead, smothered by, as wags were soon putting it, “the Shroud of Turin.”

If Chris Curry wanted the popular love that Clive Sinclair enjoyed, Sir Clive coveted something that belonged to Curry: respectability. The image of his machines as essentially toys, good for games and perhaps a bit of BASIC-learning but not much else, rankled him deeply. He therefore decided that his company’s next computer would not be a direct successor to the Spectrum but rather a “Quantum Leap” into the small-business and educational markets where Acorn had been enjoying so much success.

He shouldn’t have bothered. While the Electron was a competent if somewhat underwhelming little creation, the Sinclair QL was simply botched every which way from Tuesday right from start to finish. Apparently for marketing reasons as much as anything else, Sir Clive decided on a chip from the new Motorola 68000 line that had everyone talking. Yet to save a few pounds he insisted that his engineers use the 68008 rather than the 68000 proper, the former being a crippled version of the latter with an 8-bit rather than 16-bit data bus and, as a result, about half the overall processing potential. He also continued his bizarre aversion to disk drives, insisting that the QL come equipped with two of his Microdrives instead — a classically Sinclairian bit of tortured technology that looked much like one of those old lost and unlamented 8-track audio tapes and managed to be far slower than a floppy disk and far less reliable than a cassette tape (previously the most unreliable form of computer storage known to man). The only possible justification for the contraption was sheer bloody-mindedness — or anticipation of the money Sinclair stood to make as the sole sellers of Microdrive media if they could ever just get the punters to start buying the things. These questionable decisions alone would have been enough to torpedo the QL. They were, however, just the tip of an iceberg. Oh, what an iceberg…

The QL today feels like an artifact from an alternate timeline of computing in which the arrival of new chips and new technologies didn’t lead to the paradigm shifts of our own timeline. No, in this timeline things just pretty much stayed as they had been, with computers booting up to a BASIC environment housed in ROM and directed via arcane textual commands. The QL must be one of the most profoundly un-visionary computers ever released. The 68000 line wasn’t important just because it ran faster than the old 8-bit Z80s and 6502s; Intel’s 16-bit 8086 line had been doing that for years. It was important because, among other things, its seven levels of external interrupts made it a natural choice for the new paradigm of the graphical user interface and the new paradigm of programming required to write for a GUI: event-driven (as opposed to procedural) programming. This is the reason Apple chose it for their revolutionary Lisa and Macintosh. Sinclair, however, simply used a 68008 like a souped-up Z80, leaving one feeling that he had rather missed a pretty significant point. It’s an indictment that’s doubly damning in light of Sir Clive’s alleged role at Sinclair as a sort of visionary-in-chief — or, to choose a particularly hyperbolic contemporary description from The Sun, as “the most prodigious inventor since Leonardo.” But then, as we shall see, computers didn’t ultimately have a lot to do with Sir Clive’s visions.

Clive Sinclair launches the QL

The big unveiling of the QL on January 12, 1984, was a landmark of smoke and mirrors even by Sinclair’s usual standards. Sir Clive declared there that the QL would begin shipping within 28 days to anyone who cared to order one at the low price of £400, despite the fact that no functioning QL actually existed. I don’t mean, mind you, that the prototypes had yet to go into production. I mean rather that no one at Sinclair had yet managed to cobble together a single working machine. Press in attendance were shown non-interactive demonstrations played back on monitors from videotape, while the alleged prototype was kept well away from them. Reporters were told that they could book a review machine, to be sent to them “soon.”

The question of just why Sinclair was in such a godawful hurry to debut the QL is one that’s never been satisfactorily answered. Some have claimed that Sir Clive was eager to preempt Apple’s unveiling of the Macintosh, scheduled for less than two weeks later, but I tend to see this view as implying an awareness of the international computer industry and trends therein that I’m not sure Sir Clive possessed. One thing, however, is clear: the oft-repeated claim that the QL represents the first mass-market 68000-based computer doesn’t hold water. Steve Jobs debuted a working Macintosh on January 24, 1984, and Apple started shipping the Macintosh months before Sinclair did the QL.

As those 28 days stretched into months, events went through the same cycle that had greeted previous Sinclair launches: excitement and anticipation fading into anger and accusations of bad faith and, soon enough, yet another round of investigations and threats by the Advertising Standards Authority. Desperate to show that the QL existed in some form and avoid legal action on behalf of the punters whose money they’d been holding for weeks or months, Sinclair hand-delivered a few dozen machines to journalists and customers in April. These sported an odd accessory: a square appendage hanging off the back of the otherwise sleek case. It seems Sinclair’s engineers had realized at some late date that they couldn’t actually fit everything they were supposed to inside the case. By the time QLs finally started shipping in quantity that summer the unwanted accessory had been removed and its contents somehow stuffed inside the case proper, but that turned out to have been the least of the machine’s problems.

Early Sinclair QL complete with dongle

Amongst the more troubling of these was a horrid keyboard, something of another Sinclair tradition by now. Sinclair did deign to give the new machine actual plastic keys in lieu of the famous “dead flesh” rubber keys of the Spectrum, but the keys still rested upon a cheap membrane rather than having the mechanical action of such high-flying competitors as the Commodore VIC-20. The keyboard was awful to type on, a virtual kiss of death all by itself for a supposed business computer. And it soon emerged that the keyboard, like everything else on the QL, didn’t work properly on even its own limited terms. Individual keys either stuck or didn’t register, or did both as the mood struck them. Reports later emerged that Sinclair had actually solicited bids for a mechanical keyboard from a Japanese manufacturer and found it would cost very little if anything more than the membrane job, but elected to stick with the membrane because it was such a “Sinclair trademark.” The mind boggles.

And then there were the performance problems brought on by a perfect storm of a crippled CPU, the Microdrives, and the poorly written business software that came with the machine. Your Computer magazine published the following astonishing account of what it took to save a 750-word document in the word processor:

1. Press F3 key followed by 6. A period of 35 seconds elapses by which time the computer has found the save section of Quill and then asks if I wish to save the default file, i.e. the file I am working on.

2. Press ENTER. After a further 10 seconds the computer finds that the file already exists and asks if I wish to overwrite it.

3. Press Y. A period of 100 seconds elapses while the old file is erased and the new one saved and verified in its place. The user is then asked if he wishes to carry on with the same document.

4. Press ENTER. Why a further 25 seconds is required here is beyond me as the file must be in memory as we have just saved it. Unfortunately, the file is now at the start, so to get back to where I was:

5. Press F3 key then G followed by B. The Goto procedure to get to the bottom of the file, a further 28 seconds.

For those keeping score, that’s 3 minutes and 18 seconds to save a 750-word document. For a 3000-word document, that time jumped to a full five minutes.
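For anyone inclined to double-check the reviewer’s stopwatch, the five delays quoted above really do sum to the figure given. A trivial sketch in Python (the step labels are my own shorthand descriptions, not terminology from Quill or the review):

```python
# Delays in seconds reported by Your Computer for saving a 750-word
# document in Quill on the Sinclair QL. Step labels are descriptive
# only; they don't come from the review itself.
delays = {
    "find save section of Quill": 35,
    "confirm default file": 10,
    "erase, save, and verify": 100,
    "return to document": 25,
    "goto bottom of file": 28,
}

total = sum(delays.values())
minutes, seconds = divmod(total, 60)
print(f"{total} seconds = {minutes} min {seconds} s")  # 198 seconds = 3 min 18 s
```

Which confirms the tally: 198 seconds, or 3 minutes and 18 seconds, to save a single short document.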

Your Computer concluded their review of the QL with a prime demonstration of the crazily mixed messaging that marked all coverage of the machine. It was “slightly tacky,” “the time for foisting unproven products on the marketplace has gone,” and “it would be a brave business which would entrust essential data to Microdrives.” Yet it was also a “fascinating package” and “certain to be a commercial success.” It arguably was “fascinating” in its own peculiar way. “Commercial success,” however, wasn’t in the cards. Sinclair did keep plugging away at the QL for months after its release, and did manage to make it moderately more usable. But the damage was long since done. Even the generally forgiving British public couldn’t accept the eccentricities of this particular Sinclair creation. Sales were atrocious. Still, Sir Clive, never one to give up easily, continued to sell and promote it for almost two years.

There’s a dirty secret about Sir Clive Sinclair the computer visionary that most people never quite caught on to: he really didn’t know that much about computers, nor did he care all that much about them. Far from being the “most prodigious inventor since Leonardo,” Sir Clive remained fixated for decades on exactly two ideas: his miniature television and his electric car. The original Sinclair ZX80 had been floated largely to get Sinclair Research off the ground so that he could pursue those twin white whales. Computers had been a solution to a cashflow problem, a means to an end. His success meant that by 1983 he had the money he needed to go after the television and the car, the areas where he would really make his mark, full on. Both being absolutely atrocious ideas, this was bad, bad news for anyone with a vested interest in Sinclair Research.

The TV80 was a fairly bland failure by Sinclair standards: he came, he spent millions manufacturing thousands of devices that mostly didn’t work properly and that nobody would have wanted even if they had, and he exited again full of plans for the next Microvision iteration, the one that would get it right and convince the public at last of the virtues of a 2-inch television screen. But the electric car… ah, that one was one for the ages, one worthy of an honored place beside the exploding watches of yore. Sir Clive’s C5 electric tricycle was such an awful idea that even his normally pliable colleagues resisted letting Sinclair Research get sucked up in it. He therefore took £8.6 million out to found a new company, Sinclair Vehicles.

The biggest problem in making an electric car, then and now, is developing batteries light enough, powerful enough, and long-lasting enough to rival gasoline or diesel. In 1984, researchers were still a long way from that goal. A kilogram of gasoline has an energy potential of 13,000 watt-hours; a state-of-the-art lead-acid battery circa 1984 had an energy potential of 50 watt-hours. That’s the crux of the problem; everything else is a relative triviality. Having no engineering solution to offer for the hard part of the problem, Sinclair solved it through a logical leap that rivals any of Douglas Adams’s comedic syllogisms: he would simply pretend the hard problem didn’t exist and just do the easy stuff. From his adoring biography The Sinclair Story:

Part of the ground-up approach was not to spend enormous amounts trying to develop a more efficient battery, but to make use of the models available. Sinclair’s very sound reasoning was that a successful electric vehicle would provide the necessary push to battery manufacturers to pursue their own developments in the fullness of time; for him to sponsor this work would be a misplacement of funds.

There’s of course a certain chicken-or-egg problem inherent in this “sound reasoning,” in that the reason a “successful electric vehicle” didn’t yet exist was precisely because a successful electric vehicle required improved battery technology to power it. Or, put another way: if you could make a successful electric vehicle without improved batteries, why would its existence provide a “push to battery manufacturers?” Rather than a successful electric vehicle, Sir Clive made the QL and Black Watch of electric vehicles all rolled into one, an absurd little tricycle that was simultaneously underwhelming (to observe) and terrifying (to actually drive in traffic).
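To put a number on the gap described above: taking the energy figures quoted earlier at face value, gasoline held 260 times as much energy per kilogram as a 1984 lead-acid battery. A back-of-the-envelope sketch in Python (the fourfold drivetrain-efficiency advantage for electric motors is my own rough assumption for illustration, not a figure from the text):

```python
# Energy density figures as quoted in the text (watt-hours per kilogram).
gasoline_wh_per_kg = 13_000
lead_acid_wh_per_kg = 50

ratio = gasoline_wh_per_kg / lead_acid_wh_per_kg
print(f"Gasoline stores {ratio:.0f}x the energy per kilogram")  # 260x

# Even granting an electric drivetrain roughly 4x the efficiency of a
# petrol engine (an illustrative assumption, not a figure from the
# text), the effective gap in 1984 remains enormous.
effective_ratio = ratio / 4
print(f"Effective gap after efficiency: ~{effective_ratio:.0f}x")  # ~65x
```

Numbers like these are why no amount of clever industrial design around the battery could have made the C5 a practical vehicle.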

Sinclair C5

He unveiled the C5 on January 10, 1985, almost exactly one year after the QL dog-and-pony show and for the same price of £400. The press assembled at Alexandra Palace couldn’t help but question the wisdom of unveiling an open tricycle on a cold January day. But, once again, logistics were the least of the C5’s problems. A sizable percentage of the demonstration models simply didn’t work at all. The journalists dutifully tottered off on those that did, only to find that the advertised top speed of 15 mph was actually more like 5 mph — a brisk walking speed — on level ground. The batteries in many of the tricycles went dead or overheated — it was hard to tell which — with a plaintive little “Peep! Peep!” well before their advertised service range of 20 miles. Those journalists whose batteries did hold out found that they didn’t have enough horsepower to get up the modest hill leading back to the exhibition area. It was a disgruntled and disheveled group of cyclists who straggled back to Sir Clive, pedaling or lugging the 30-kilogram gadgets alongside. They could take comfort only in the savaging they were about to give him. When the press found out that the C5 was manufactured in a Hoover vacuum-cleaner plant and its motor was a variation on one developed for washing machines, the good times only got that much better. If there’s a single moment when Sir Clive turned the corner from visionary to laughingstock, this is it.

Sinclair Research wasn’t doing a whole lot better than its founder as 1984 turned into 1985. In addition to the huge losses sustained on the QL and TV80 fiascoes, Sinclair had, like Acorn, lost a bundle in the United States. Back in 1982, they had cut a deal with the American company Timex, who were already manufacturing all of their computers for them from a factory in Dundee, Scotland, to export the ZX81 to America as the Timex Sinclair 1000. It arrived in July of 1982, just as the American home-computing boom was taking off. Priced at $99 and extravagantly advertised as “the first computer under $100,” the TS 1000 sold like gangbusters for a short while; for a few months it was by far the bestselling computer in the country. But it was, with its 2 K of memory, its calculator keyboard, and its blurry text-only black-and-white display, a computer in only the most nominal sense. When Jack Tramiel started in earnest his assault on the low-end later in the year with the — relatively speaking — more useful and usable Commodore VIC-20, the TS 1000 was squashed flat.

Undeterred, Timex and Sinclair tried again with an Americanized version of the Spectrum, the TS 2068. With the best of intentions, they elected to improve the Speccy modestly to make it more competitive in America, adding an improved sound chip, a couple of built-in joystick ports (British Speccy owners had to buy a separate interface), a couple of new graphics modes, a cartridge port, even a somewhat less awful version of Sinclair’s trademark awful keyboards. The consequence of those improvements, however, was that most existing Spectrum software became incompatible. This weird little British machine with no software support was priced only slightly less than the Commodore 64 with its rich and growing library of great games. It never had a chance. Timex, like other big players such as Texas Instruments and Coleco, were soon sheepishly announcing their withdrawal from the home-computer market, vanquished like the others by Commodore.

Back in Britain, meanwhile, it was becoming clear that, as if Sinclair hadn’t already had enough problems, domestic sales of the Spectrum were beginning to slow. Sinclair was still in a dominant position, owning some 40 percent of the British market. However, conventional wisdom had it that that market was becoming saturated; by late 1984 most of the people in Britain who were likely to buy a computer had already done so, to the tune of far more sales per capita than any other country on the planet. Sinclair’s only chance to maintain sales seemed to be to sell new machines to those who already owned older models. Yet they had pissed away the time and resources needed to create a next-generation Speccy on the QL. In desperation they rushed out something called the Spectrum Plus for Christmas 1984: a slightly more substantial-looking Spectrum with a better keyboard like that of the QL (still not a genuinely good one, of course; “Sinclair trademark” and all that). With no changes to its actual computing capabilities, this wasn’t exactly a compelling upgrade package for current Spectrum owners. And, Sinclair still being Sinclair, the same old problems continued; most Spectrum Pluses arrived with several of the vaunted new plastic keys floating around loose in the box.

By mid-1985, Sinclair’s position wasn’t a whole lot better than that of Acorn. They were drowning in unsold inventories of Spectrums and QLs dating back to the previous Christmas season and even before, mired in debt, and without the resources to develop the Spectrum successor they desperately needed.

Then it seemed that their own Olivetti-equivalent had arrived. In a “World Exclusive!” article in the June 17, 1985, edition, the Daily Mirror announced that “Maxwell Saves Sinclair.” The Maxwell in question was the famous British tycoon and financier Robert Maxwell, who would inject some £12 million into the company. In return, Sir Clive would have to accept some adult supervision: he would become a “life president” and consultant, with Maxwell installing a management team of his own choosing. Everyone was relieved, even Margaret Thatcher. “The government has been aware that these talks have been going on and welcomes any move to put the Sinclair business on a firm footing,” said a spokesman.

Then, not quite two months after the carefully calibrated leak to the Daily Mirror, Maxwell suddenly scuttled the deal. We’re not quite sure why. Some have said that, after a thorough review of Sinclair’s books, Maxwell concluded the company was simply irredeemable; some that Sir Clive refused to quietly accept his “life president” post and go away the way Maxwell expected him to; some that Sir Clive planned to go away all too soon, taking with him a promising wafer-scale chip integration process a few researchers had been working on to serve as his lifeboat and bridge to yet another incarnation of an independent Sinclair, as the ZX80 had served as a bridge between the Sinclair Radionics of the 1970s and the Sinclair Research of the 1980s. Still others say that Sir Clive was never serious about the deal, that the whole process was a Machiavellian plot on his part to keep his creditors at bay until the Christmas buying season began to loom, after which they would continue to wait and see in the hope that Sinclair could sell off at least some of all that inventory before the doors were shut. This last, at least, I tend to doubt; like the idea that he staged the QL unveiling to upstage the Macintosh, it ascribes a level of guile and business acumen to Sir Clive that I’m not sure he possessed.

At any rate, Sinclair Research staggered into 1986 alive and still independent but by all appearances mortally wounded. A sign of just how far they had fallen came when they had to beg the next Spectrum iteration from some of the people they were supposed to be supplying it to: Spain’s Investrónica, signatories to the only really viable foreign distribution deal they had managed to set up. The Spectrum 128 was a manifestation of Investrónica’s impatience and frustration with their partner. After waiting years for a properly updated Spectrum, they had decided to just make their own. Created as it was quickly by a technology distributor rather than technology developer, the Spectrum 128 was a bit of a hack-and-splice job, grafting an extra 80 K of memory, an improved sound chip, and some other bits and pieces onto the venerable Speccy framework. Nevertheless, it was better than nothing, and it was compatible with older Speccy games. Sinclair Research scooped it up and started selling it in Britain as well.

The state of Acorn and Sinclair as 1986 began was enough to trigger a crisis of faith in Britain. The postwar era, and particularly the 1970s, had felt for many people like the long, slow unraveling of an economy that had once been the envy of the world. It wasn’t only Thatcher’s Conservatives who had seen Sir Clive and Acorn as standard bearers leading the way to a new Britain built on innovation and silicon. If many other areas of the economy were finally, belatedly improving after years and years of doldrums, the sudden collapse of Sinclair and Acorn nevertheless felt like a bucket of cold water to the dreamer’s face. All of the old insecurities, the old questions about whether Britain could truly compete on the world economic stage came to the fore again to a degree thoroughly out of line with what the actual economic impact of a defunct Acorn and Sinclair would have been. Now those who still clung to dreams of a silicon Britain found themselves chanting an unexpected mantra: thank God for Alan Sugar.

Sugar, the business titan with the R&B loverman’s name, had ended his formal schooling at age 16. A born salesman and wheeler and dealer, he learned his trade as an importer and wholesaler on London’s bustling Tottenham Court Road, then as now one of the densest collections of electronics shops in Europe. He founded his business, Amstrad, literally out of the back of a van there in 1968. By the late 1970s he had built Amstrad into a force to be reckoned with as purveyors of discount stereo equipment, loved by his declared target demographic of “the truck driver and his wife” as much as it was loathed by audiophiles.

He understood his target market so well because he was his target market. An unrepentant Eastender, he never tried to refine his working-class tastes, never tried to smooth away his Cockney diction despite living in a country where accent was still equated by many with destiny. The name of his company was itself a typical Cockneyism, a contraction of its original name of A.M.S. Trading Company (“A.M.S.” being Sugar’s initials). Sugar:

There was the snooty area of the public that would never buy an Amstrad hi-fi and they went out and bought Pioneer or whatever, and they’re 5 percent of the market. The other 95 percent of the market wants something that makes a noise and looks good. And they bought our stuff.

An Amstrad stereo might not be the best choice for picking out the subtle shadings of the second violin section, but it was just fine for cranking out the latest Led Zeppelin record good and loud. Sugar’s understanding of what constituted “good enough” captured fully one-third of the British stereo market for Amstrad by 1982, far more than any other single company.

In 1983, Sugar suddenly decided that Amstrad should build a home computer to compete with Sinclair, Acorn, and Commodore. Conventional wisdom would hold that this was absolutely terrible timing. Amstrad was about to jump into the market just in time for it to enter a decline. Still, if Sugar could hardly have been aware of what 1984 and 1985 would bring, he did see some fairly obvious problems with the approach of his would-be competitors which he believed Amstrad could correct. In a sense, he’d been here before.

Stereos had traditionally been sold the way that computer systems were in 1983: as mix-and-match components — an amplifier here, a tape deck and record player there, speakers in that corner — which the buyer had to purchase separately and assemble herself. One of Sugar’s greatest coups had come when he had realized circa 1978 that his truck drivers hated this approach at least as much as the audiophiles reveled in it. They hated comparing a bunch of gadgets with specifications they didn’t understand anyway; hated opening a whole pile of boxes and trying to wire everything together; hated needing four or five sockets just to power one stereo. Amstrad therefore introduced the Tower System: one box, one price, one socket — plug it in and go. It became by far their biggest seller, and changed the industry in the process.

Amstrad’s computer would follow the same philosophy, with the computer, a tape drive, and a monitor all sold as one unit. The included monitor in particular would become a marketing boon. Monitors being quite unusual in Britain, many a family was wracked with conflict every evening over whether the television was going to be used for watching TV or playing on the Speccy. The new Amstrad would, as the advertisements loudly proclaimed, make all that a thing of the past.

Amstrad CPC-464

The CPC-464 computer which began shipping in June of 1984 was in many other ways a typical Amstrad creation. Sugar, who considered “boffin” a term of derision, was utterly uninterested in technological innovation for its own sake. Indeed, Sugar made it clear from the beginning that, should the CPC-464 disappoint, he would simply cut his losses and drop the product, as he had at various times televisions, CB radios, and car stereos before it. He was interested in profits, not the products which generated them. So, other than its integrated design, the CPC-464 innovated nowhere. Instead it was just a solid, conservative computer that was at least in the same ballpark as the competition in every particular and matched or exceeded it in most: 64 K of memory, impressive color graphics, a decent sound chip, a more than decent BASIC. Build quality and customer service were, if not quite up to Acorn’s standards, more than a notch or two above Sinclair’s and more than adequate for a computer costing about £350 with tape drive and color monitor. Amstrad also did some very smart things to ease the machine’s path to consumer adoption: they paid several dozen programmers to have a modest library of games and other software available right from launch, and started Amstrad Computer User magazine to begin to build a community of users. These strategies, along with the commonsense value-for-your-pound approach of the machine itself, let the CPC-464 and succeeding machines do something almost inconceivable to the competitors collapsing around them: post strong sales that continued to grow by the month, making stereos a relatively minor part of Amstrad’s booming business within just a couple of years.

Amstrad’s results were so at odds with those of the industry as a whole that for a considerable length of time the City simply refused to believe them. Their share price continued to drop through mid-1985 in direct defiance of rosy sales figures. It wasn’t until Amstrad’s fiscal year ended in June and the annual report appeared showing sales of £136.1 million and an increase in profits of 121 percent that the City finally began to accept that Amstrad computers were for real. Alan Sugar describes in his own inimitable way the triumphalism of this period of Amstrad’s history:

The usual array of predators, such as Dixons, W. H. Smith, and Boots, were hovering around like the praying mantis, saying, “Ha, ha, you’ve got too many computers, haven’t you? We’re going to jump on you and steal them off you and rape you when you need money badly, just like Uncle Clive.” And we said, “We haven’t got any.” They didn’t believe us, until such time as they had purged their stocks and finished raping Clive Sinclair and Acorn, and realized they had nothing left to sell. So they turned to us again in November of 1985 and said, “What about a few of your computers at cheaper prices?” We stuck the proverbial two fingers in the air, and that’s how we got price stability back into the market. They thought we were sitting on stockpiles and they were doing us a big favour. But we had no inventory. It had gone to France and Spain.

Continental Europe was indeed a huge key to Amstrad’s success. When Acorn and Sinclair had looked to expand internationally, they had looked to the hyper-competitive and already troubled home-computer market in the United States, an all too typical example of British Anglocentrism. (As Bill Bryson once wrote, a traveler visiting Britain with no knowledge of geography would likely conclude from the media and the conversations around her that Britain lay a few miles off the coast of the United States, perhaps about where Cuba is in our world, and it was the rest of Europe that was thousands of miles of ocean away.) Meanwhile they had all but ignored all that virgin territory that started just a ferry ride away. Alan Sugar had no such prejudices. He let America alone, instead pushing his computers into Spain, France, and the German-speaking countries (where they were initially sold under the Schneider imprint — ironically, another company that had gotten its start selling low-priced stereo equipment). Amstrad’s arrival, along with an increasingly aggressive push from Commodore’s West German subsidiary, marks the moment when home computers at last began to spread in earnest through Western Europe, to be greeted there by kids and hackers with just as much enthusiasm and talent as their British, American, and Japanese counterparts.

One day in early 1986, Alan Sugar received an unexpected call from Mark Souhami, manager of the Dixons chain of consumer-electronics stores. Souhami dropped a bombshell: it seemed that Sir Clive was interested in selling his computer operation to Amstrad, the only company left in the market with the resources for such an acquisition. Dixons, who still sold considerable numbers of Spectrums and thus had a vested interest in keeping the supply flowing, had been recruited to act as intermediaries. Sir Clive and Sugar soon met personally for a quiet lunch in Liverpool Street Station. Sir Clive later reported that he found Sugar “delightful” — “very pleasant company, a witty man.” Sugar was less gracious, ruthlessly mocking in private Sir Clive’s carefully cultivated “Etonian accent” and his intellectual pretensions.

At 3:00 AM on April 2, 1986, after several weeks of often strained negotiations, Amstrad agreed to buy all of the intellectual property for and existing stocks of Sinclair’s computers for £16 million. The sum would allow Sir Clive to pay off his creditors and make a clean break from the computer market to pursue his real passions. Tellingly, Sinclair Research itself, along with the TV80 and the C5, was explicitly excluded from the transfer — not that Sugar had any interest in such financial losers anyway. With a stroke of the pen, Alan Sugar and Amstrad now owned 60 percent of the British home-computer market along with a big chunk of the exploding continental European market. All less than two years after the CPC-464 had debuted under a cloud of doubt.

Clive Sinclair and Alan Sugar

When Sugar and Sir Clive officially announced their deal at a press conference on April 7, the press rightly marked it as the end of an era. The famous photograph of their uncomfortable handshake before the assembled flash bulbs stands as one of the more indelible in the history of British computing, a passing of the mantle from Sir Clive the eccentric boffin to Sugar the gruff, rough, and ruthless man of the bottom line. British computing had lost its innocence, and things would never quite be the same again. Thatcher had backed the wrong horse in choosing Sir Clive as her personification of the new British capitalist spirit. (Sugar would get a belated knighthood of his own in 2000.) On the plus side, British computing was still alive as an independent entity, a state of affairs that had looked very doubtful just the year before. Indeed, it was poised to make a huge impact yet through Amstrad.

Those who fretted that Sugar might have bought the Spectrum just to kill it needn’t have; he was far too smart and unsentimental for that. If people still wanted Spectrums, he would give them Spectrums. Amstrad thus remade the Speccy with an integrated tape drive in the CPC line’s image and continued to sell it as the low end of their lineup into the 1990s, until even the diehards had moved on. Quality and reliability improved markedly, and the thing even got a proper keyboard at long last. The QL, however, got no such treatment; Sugar put it out of its misery without a second thought.

Clive Sinclair rides off into the sunset

I’ll doubtless have more to say about a triumphant Amstrad and a humbled but still technically formidable Acorn in future articles. Sir Clive, however, will now ride off into the sunset — presumably on a C5 — to tinker with his electric cars and surface occasionally to delight the press with a crazy anecdote. He exited the computer market with dreams as grandiose as ever, but no one would ever quite take him seriously again. For a fellow who takes himself so manifestly seriously, that has to be a difficult thing to bear. Sinclair Research exists as a nominal corporation to this day, but for most of the past three decades its only actual employee appears to have been Sir Clive himself, still plugging away at his electric car (miniaturized televisions have not been in further evidence). I know I’ve been awfully hard on Sir Clive, but in truth I rather like him. He possessed arrogance, stubbornness, and shortsightedness in abundance, but no guile and very little greed. Amongst the rogues’ gallery of executives who built the international PC industry, that practically qualifies him for sainthood. He was certainly the most entertaining computer mogul of all time, and he did manage almost in spite of himself to change Britain forever. The British public still has a heartfelt affection for the odd little fellow — as well they should. Eccentrics like him don’t come around every day.

(Much of this article was drawn from following the news items and articles in my favorite of the early British micro magazines, Your Computer, between January 1984 and May 1986. Other useful magazines: Popular Computing Weekly of November 10, 1983, and January 12, 1984; Sinclair User of November 1984, February 1985, and March 1985. Two business biographies of Sir Clive are recommended, one admiring and one critical: The Sinclair Story by Rodney Dale and Sinclair and the “Sunrise” Technology by Ian Adamson and Richard Kennedy respectively. The best account I’ve found of Amstrad’s early history is in Alan Sugar: The Amstrad Story by David Thomas. Good online articles: The Register’s features on the Sinclair Microdrives, the QL, and the Acorn Electron; Stairway to Hell’s reprinting of a series of articles on Acorn’s history from Acorn User magazine. Finally, by all means check out the delightful BBC docudrama Micro Men if you haven’t already and marvel that the events and personalities depicted therein are only slightly exaggerated. That film is also the source of the last picture in this article; it was just too perfect an image to resist.)
