Cracking Open the Mac

The biggest problem with the Macintosh hardware was pretty obvious, which was its limited expandability. But the problem wasn’t really technical as much as philosophical, which was that we wanted to eliminate the inevitable complexity that was a consequence of hardware expandability, both for the user and the developer, by having every Macintosh be identical. It was a valid point of view, even somewhat courageous, but not very practical, because things were still changing too fast in the computer industry for it to work, driven by the relentless tides of Moore’s Law.

— original Macintosh team-member Andy Hertzfeld

Jef Raskin and Steve Jobs didn’t agree on much, but they did agree on their loathing for expansion slots. The absence of slots was one of the bedrock attributes of Raskin’s original vision for the Macintosh, the most immediately obvious difference between it and Apple’s then-current flagship product, the Apple II. In contrast to Steve Wozniak’s beloved hacker plaything, Raskin’s computer for the people would be as effortless to set up and use as a stereo, a television, or a toaster.

When Jobs took over the Macintosh project — some, including Raskin himself, would say stole it — he changed just about every detail except this one. Yet some members of the tiny team he put together, fiercely loyal to their leader and his vision of a “computer for the rest of us” though they were, were beginning to question the wisdom of this aspect of the machine by the time the Macintosh came together in its final form. It was a little hard in January of 1984 not to question the wisdom of shipping an essentially unexpandable appliance with just 128 K of memory and a single floppy-disk drive for a price of $2495. At some level, it seemed, this just wasn’t how the computer market worked.

Jobs would reply that the whole point of the Macintosh was to change how computers worked, and with them the workings of the computer market. He wasn’t entirely without concrete arguments to back up his position. One had only to glance over at the IBM clone market — always Jobs’s first choice as the antonym to the Mac — to see how chaotic a totally open platform could be. Clone users were getting all too familiar with the IRQ and memory-address conflicts that could result from plugging two cards that were determined not to play nice together into the same machine, and software developers were getting used to chasing down obscure bugs that only popped up when their programs ran on certain combinations of hardware.

Viewed in the big picture, we could actually say that Jobs was prescient in his determination to stamp out that chaos, to make every Macintosh the same as every other, to make the platform in general a thoroughly known quantity for software developers. The norm in personal computing as most people know it — whether we’re talking phones, tablets, laptops, or increasingly even desktop computers — has long since become sealed boxes of one stripe or another. But there are some important factors that make said sealed boxes a better idea now than they were back then. For one thing, the pace of hardware and software development alike has slowed enough that a new computer can remain viable, just as it was when purchased, for ten years or more. For another, prices have come down enough that throwing a device away and starting over with a new one isn’t so cost-prohibitive as it once was. With personal computers still exotic, expensive machines in a constant state of flux at the time of the Mac’s introduction, the computer as a sealed appliance was a vastly more problematic proposition.

Determined to do everything possible to keep users out of the Mac’s innards, Apple used Torx screws for which screwdrivers weren’t commonly available to seal it, and even threatened users with electrocution should they persist in trying to open it. The contrast with the Apple II, whose top could be popped in seconds using nothing more than a pair of hands to reveal seven tempting expansion slots, could hardly have been more striking.

It was the early adopters who spotted the potential in that first slow, under-powered Macintosh, the people who believed Jobs’s promise that the machine’s success or failure would be determined by the number who bought it in its first hundred days on the market, who bore the brunt of Apple’s decision to seal it as tightly as Fort Knox. When Apple in September of 1984 released the so-called “Fat Mac” with 512 K of memory, the quantity that in the opinion of just about everyone — including most of those at Apple not named Steve Jobs — the machine should have shipped with in the first place, owners of the original model were offered the opportunity to bring their machines to their dealers and have them retro-fitted to the new specifications for $995. This “deal” sparked considerable outrage and even a letter-writing campaign that tried to shame Apple into bettering the terms of the upgrade. Disgruntled existing owners pointed out that their total costs for a 512 K Macintosh amounted to $3490, while a Fat Mac could be bought outright by a prospective new member of the Macintosh fold for $2795. “Apple should have bent over backward for the people who supported it in the beginning,” said one of the protest’s ringleaders. “I’m never going to feel the same about Apple again.” Apple, for better or for worse never a company that was terribly susceptible to such public shaming, sent their disgruntled customers a couple of free software packages and told them to suck it up.

The Macintosh Plus

Barely fifteen months later, when Apple released the Macintosh Plus with 1 MB of memory among other advancements, the merry-go-round spun again. This time the upgrade would cost owners of the earlier models over $1000, along with lots of downtime while their machines sat in queues at their dealers. With software developers rushing to take advantage of the increased memory of each successive model, dedicated users could hardly afford to regard any upgrade as optional. As things stood, then, they were effectively paying a service charge of about $1000 per year just to remain a part of the Macintosh community. Owning a Mac was like owning a car that had to go into the shop for a week for a complete engine overhaul once every year. Apple, then as now, was famous for the loyalty of their users, but this was stretching even that legendary goodwill to the breaking point.

For some time voices within Apple had been mumbling that this approach simply couldn’t continue if the Macintosh was to become a serious, long-lived computing platform; Apple simply had to open the Mac up, even if that entailed making it a little more like all those hated beige IBM clones. During the first months after the launch, Steve Jobs was able to stamp out these deviations from his dogma, but as sales stalled and his relationship with John Sculley, the CEO he’d hand-picked to run the company he’d co-founded, deteriorated, the grumblers grew steadily more persistent and empowered.

The architect of one of the more startling about-faces in Apple’s corporate history would be Jean-Louis Gassée, a high-strung marketing executive newly arrived in Silicon Valley from Apple’s French subsidiary. Gassée privately — very privately in the first months after his arrival, when Jobs’s word still was law — agreed with many on Apple’s staff that the only way to achieve the dream of making the Macintosh into a standard to rival or beat the Intel/IBM/Microsoft trifecta was to open the platform. Thus he quietly encouraged a number of engineers to submit proposals on what direction they would take the platform in if given free rein. He came to favor the ideas of Mike Dhuey and Brian Berkeley, two young engineers who envisioned a machine with slots as plentiful and easily accessible as those of the Apple II or an IBM clone. Their “Little Big Mac” would be based around the 32-bit Motorola 68020 chip rather than the 16-bit 68000 of the current models, and would also sport color — another Jobsian heresy.

In May of 1985, Jobs made the mistake of trying to recruit Gassée into a rather clumsy conspiracy he was formulating to oust Sculley, with whom he was now in almost constant conflict. Rather than jump aboard the coup train, Gassée promptly blew the whistle to Sculley, precipitating an open showdown between Jobs and Sculley in which, much to Jobs’s surprise, the entirety of Apple’s board backed Sculley. Stripped of his power and exiled to a small office in a remote corner of Apple’s Cupertino campus, Jobs would soon depart amid recriminations and lawsuits to found a new venture called NeXT.

Gassée’s betrayal of Jobs’s confidence may have had a semi-altruistic motivation. Convinced that the Mac needed to open up to survive, perhaps he concluded that that would only happen if Jobs was out of the picture. Then again, perhaps it came down to a motivation as base as personal jealousy. With a penchant for leather and a love of inscrutable phraseology — “the Apple II smelled like infinity” is a typical phrase from his manifesto The Third Apple, “an invitation to voyage into a region of the mind where technology and poetry exist side by side, feeding each other” — Gassée seemed to self-consciously adopt the persona of a Gallic version of Jobs himself. But regardless, with Jobs now out of the picture Gassée was able to consolidate his own power base, taking over Jobs’s old role as leader of the Macintosh division. He went out and bought a personalized license plate for his sports car: “OPEN MAC.”

Coming some four months after Jobs’s final departure, the Mac Plus already included such signs of the changing times as a keyboard with arrow keys and a numeric keypad, anathema to Jobs’s old mouse-only orthodoxy. But much, much bigger changes were also well underway. Apple’s 1985 annual report, released in the spring of 1986, dropped a bombshell: a Mac with slots was on the way. Dhuey and Berkeley’s open Macintosh was now proceeding… well, openly.

The Macintosh II

When it debuted five months behind schedule in March of 1987, the Macintosh II was greeted as a stunning but welcome repudiation of much of what the Mac had supposedly stood for. In place of the compact all-in-one-case designs of the past, the new Mac was a big, chunky box full of empty space and empty slots — six of them altogether — with the monitor an item to be purchased separately and perched on top. Indeed, one could easily mistake the Mac II at a glance for a high-end IBM clone; its big, un-stylish case even included a cooling fan, an item that placed even higher than expansion slots and arrow keys on Steve Jobs’s old list of forbidden attributes.

Apple’s commitment to their new vision of a modular, open Macintosh was so complete that the Mac II didn’t include any on-board video at all; the buyer of the $6500 machine would still have to buy the video card of her choice separately. Apple’s own high-end video card offered display capabilities unprecedented in a personal computer: a palette of over 16 million colors, 256 of them displayable onscreen at any one time at resolutions as high as 640 × 480. And, in keeping with the philosophy behind the Mac II as a whole, the machine was ready and willing to accept a still more impressive graphics card just as soon as someone managed to make one. The Mac II actually represented colors internally using 48 bits, allowing some 281 trillion different shades. These idealized colors were then translated automatically into the closest approximations the actual display hardware could manage. This fidelity to the subtlest vagaries of color would make the Mac II the favorite of people working in many artistic and image-processing fields, especially when those aforementioned even better video cards began to hit the market in earnest. Even today no other platform can match the Mac in its persnickety attention to the details of accurate color reproduction.
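The color arithmetic above is easy to verify, and the translation step is simple enough to sketch in a few lines. Here is a minimal illustration, in Python, of the general scheme the paragraph describes: colors specified at 16 bits per RGB channel (2^48 = 281,474,976,710,656 combinations, the “281 trillion shades”), then quantized down to the precision of the installed display hardware. The 8-bit display depth and all names here are illustrative assumptions, not a reconstruction of Apple’s actual QuickDraw internals.

```python
# A toy model of the scheme described above: "idealized" 48-bit colors
# (three 16-bit channels) quantized to what the video card can display.
# The 8-bit display depth is an assumed example, not Apple's actual API.

def quantize_channel(value16, display_bits=8):
    """Drop a 16-bit channel value down to the display's precision."""
    return value16 >> (16 - display_bits)

ideal = (0xFFFF, 0x8000, 0x1234)   # one 48-bit color, as 16-bit R, G, B
shown = tuple(quantize_channel(c) for c in ideal)
print(shown)                       # -> (255, 128, 18): a close 24-bit match
```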

Some of the Mac II’s capabilities truly were ahead of their time. Here we see a desktop extended across two monitors, each powered by its own video card.

The irony wasn’t lost on journalists or users when, just weeks after the Mac II’s debut, IBM introduced their new PS/2 line, marked by sleeker, slimmer cases and many features that would once have been placed on add-on cards now integrated into the motherboards. While Apple was suddenly encouraging the sort of no-strings-attached hardware hacking on the Macintosh that had made their earlier Apple II so successful, IBM was trying to stamp that sort of thing out on their own heretofore open platform via their new Micro Channel Architecture, which demanded that anyone other than IBM who wanted to expand a PS/2 machine negotiate a license and pay for the privilege. “The original Mac’s lack of slots stunted its growth and forced Apple to expand the machine by offering new models,” wrote Byte. “With the Mac II, Apple — and, more importantly, third-party developers — can expand the machine radically without forcing you to buy a new computer. This is the design on which Apple plans to build its Macintosh empire.” It seemed like the whole world of personal computing was turning upside down, Apple turning into IBM and IBM turning into Apple.

The Macintosh SE

If so, however, Apple’s empire would be a very exclusive place. By the time you’d bought a monitor, video card, hard drive, keyboard — yes, even the keyboard was a separate item — and other needful accessories, the price of a Mac II system could climb uncomfortably close to the $10,000 mark. Those who weren’t quite flush enough to splash out that much money could still enjoy a taste of the Mac’s new spirit of openness via the simultaneously released Mac SE, which cost $3699 for a hard-drive-equipped model. The SE was a 68000-based machine that looked much like its forefathers — built-in black-and-white monitor included — but did have a single expansion slot inside its case. That single slot was a little underwhelming in comparison to the Mac II, but it was better than nothing, even if Apple did still recommend that customers take their machines to their dealers if they wanted to actually install something in it. Apple’s not-terribly-helpful advice for those needing to employ more than one expansion card was to buy an “integrated” card that combined multiple functions. If you couldn’t find a card that happened to combine exactly the functions you needed, you were presumably just out of luck.

During the final years of the 1980s, Apple would continue to release new models of the Mac II and the Mac SE, now established as the two separate Macintosh flavors. These updates enhanced the machines with such welcome goodies as 68030 processors and more memory, but, thanks to the wonders of open architecture, didn’t immediately invalidate the models that had come before. The original Mac II, for instance, could be easily upgraded from the 68020 to the 68030 just by dropping a card into one of its slots.

The Steve Jobs-less Apple, now thoroughly under the control of the more sober and pragmatic John Sculley, toned down the old visionary rhetoric in favor of a more businesslike focus. Even the engineers dutifully toed the new corporate line, at least publicly, and didn’t hesitate to denigrate Apple’s erstwhile visionary-in-chief in the process. “Steve Jobs thought that he was right and didn’t care what the market wanted,” Mike Dhuey said in an interview to accompany the Mac II’s release. “It’s like he thought everyone wanted to buy a size-nine shoe. The Mac II is specifically a market-driven machine, rather than what we wanted for ourselves. My job is to take all the market needs and make the best computer. It’s sort of like musicians — if they make music only to satisfy their own needs, they lose their audience.” Apple, everyone was trying to convey, had grown up and left all that changing-the-world business behind along with Steve Jobs. They were now as sober and serious as IBM, their machines ready to take their places as direct competitors to those of Big Blue and the clonesters.

To a rather surprising degree, the world of business computing accepted Apple and the Mac’s new persona. Through 1986, the machines to which the Macintosh was most frequently compared were the Commodore Amiga and Atari ST. In the wake of the Mac II and Mac SE, however, the Macintosh was elevated to a different plane. Now the omnipresent point of comparison was high-end IBM compatibles; the Amiga and ST, despite their architectural similarities, seldom even saw their existence acknowledged in relation to the Mac. There were some good reasons for this neglect beyond the obvious ones of pricing and parent-company rhetoric. For one, the Macintosh was always a far more polished experience for the end user than either of the other 68000-based machines. For another, Apple had enjoyed a far more positive reputation with corporate America than Commodore or Atari since well before any of the three platforms in question existed. Still, the nature of the latest magazine comparisons was a clear sign that Apple’s bid to move the Mac upscale was succeeding.

Whatever one thought of Apple’s new, more buttoned-down image, there was no denying that the market welcomed the open Macintosh with a matching set of open arms. Byte went so far as to call the Mac II “the most important product that Apple has released since the original Apple II,” thus elevating it to a landmark status greater even than that of the first Mac model. While history hasn’t been overly kind to that judgment, the fact remains that third-party software and hardware developers, who had heretofore been stymied by the frustrating limitations of the closed Macintosh architecture, burst out now in myriad glorious ways. “We can’t think of everything,” said an ebullient Jean-Louis Gassée. “The charm of a flexible, open product is that people who know something you don’t know will take care of it. That’s what they’re doing in the marketplace.” The biannual Macworld shows gained a reputation as the most exciting events on the industry’s calendar, the beat to which every journalist lobbied to be assigned. The January 1988 show in San Francisco, the first to reflect the full impact of Apple’s philosophical about-face, had 20,000 attendees on its first day, and could have had a lot more than that had there been a way to pack them into the exhibit hall. Annual Macintosh sales more than tripled between 1986 and 1988, with cumulative sales hitting 2 million machines in the latter year. And already fully 200,000 of the Macs out there by that point were Mac IIs, an extraordinary number really given that machine’s high price. Granted, the Macintosh had hit the 2-million mark fully three years behind the pace Steve Jobs had foreseen shortly after the original machine’s introduction. But nevertheless, it did look like at least some of the more modest of his predictions were starting to come true at last.

An Apple Watch 27 years before its time? Just one example of the extraordinary innovation of the Macintosh market was the WristMac from Ex Machina, a “personal information manager” that could be synchronized with a Mac to take the place of your appointment calendar, to-do list, and Rolodex.

While the Macintosh was never going to seriously challenge the IBM standard on the desks of corporate America when it came to commonplace business tasks like word processing and accounting, it was becoming a fixture in design departments of many stripes, and the staple platform of entire niche industries — most notably, the publishing industry, thanks to the revolutionary combination of Aldus PageMaker (or one of the many other desktop-publishing packages that followed it) and an Apple LaserWriter printer (or one of the many other laser printers that followed it). By 1989, Apple could claim about 10 percent of the business-computing market, making them the third biggest player there after IBM and Compaq — and of course the only significant player there not running a Microsoft operating system. What with Apple’s premium prices and high profit margins, third place really wasn’t so bad, especially in comparison with the moribund state of the Macintosh of just a few years before.

Steve Jobs and John Sculley in happier times.

So, the Macintosh was flying pretty high as the curtain began to come down on the 1980s. It’s instructive and more than a little ironic to contrast the conventional wisdom that accompanied that success with the conventional wisdom of today. Despite the strong counterexample of Nintendo’s exploding walled garden over in the videogame-console space, the success the Macintosh had enjoyed since Apple’s decision to open up the platform was taken as incontrovertible proof that openness in terms of software and hardware alike was the only viable model for computing’s future. In today’s world of closed iOS and Android ecosystems and computing via disposable black boxes, such an assertion sounds highly naive.

But even more striking is the shift in the perception of Steve Jobs. In the late 1980s, he was loathed even by many strident Mac fans, whilst being regarded in the business and computer-industry press and, indeed, much of the popular press in general as a dilettante, a spoiled enfant terrible whose ill-informed meddling had very nearly sunk a billion-dollar corporation. John Sculley, by contrast, was lauded as exactly the responsible grown-up Apple had needed to scrub the company of Jobs’s starry-eyed hippie meanderings and lead them into their bright businesslike present. Today popular opinion on the two men has neatly reversed itself: Sculley is seen as the unimaginative corporate wonk who mismanaged Jobs’s brilliant vision, Jobs as the greatest — or at least the coolest — computing visionary of all time. In the end, of course, the truth must lie somewhere in the middle. Sculley’s strengths tended to be Jobs’s weaknesses, and vice versa. Apple would have been far better off had the two been able to find a way to continue to work together. But, in Jobs’s case especially, that would have required a fundamental shift in who these men were.

The loss among Apple’s management of that old Jobsian spirit of zealotry, overblown and impractical though it could sometimes be, was felt keenly by the Macintosh even during these years of considerable success. Only Jean-Louis Gassée was around to try to provide a splash of the old spirit of iconoclastic idealism, and everyone had to agree in the end that he made a rather second-rate Steve Jobs. When Sculley tried on the mantle of visionary — as when he named his fluffy corporate autobiography Odyssey and subtitled it “a journey of adventure, ideas, and the future” — it never quite seemed to fit him right. The diction was always off somehow, like he was playing a Silicon Valley version of Mad Libs. “This is an adventure of passion and romance, not just progress and profit,” he told the January 1988 Macworld attendees, apparently feeling able to wax a little more poetic than usual before this audience of true believers. “Together we set a course for the world which promises to elevate the self-esteem of the individual rather than a future of subservience to impersonal institutions.” (Apple detractors might note that elevating their notoriously smug users’ self-esteem did indeed sometimes seem to be what the company was best at.)

It was hard not to feel that the Mac had lost something. Jobs had lured Sculley from Pepsi because the latter was widely regarded as a genius of consumer marketing; the Pepsi Challenge, one of the most iconic campaigns in the long history of the cola wars, had been his brainchild. And yet, even before Jobs’s acrimonious departure, Sculley, bowing to pressure from Apple’s stockholders, had oriented the Macintosh almost entirely toward taking on the faceless legions of IBM and Compaq that dominated business computing. Consumer computing was largely left to take care of itself in the form of the 8-bit Apple II line, whose final model, the technically impressive but hugely overpriced IIGS, languished with virtually no promotion. Sculley, a little out of his depth in Silicon Valley, was just following the conventional wisdom that business computing was where the real money was. Businesspeople tended to be turned off by wild-eyed talk of changing the world; thus Apple’s new, more sober facade. And they were equally turned off by any whiff of fun or, God forbid, games; thus the old sense of whimsy that had been one of the original Mac’s most charming attributes seemed to leach away a little more with each successive model.

Those who pointed out that business computing was worth many times as much as home computing weren’t wrong, but they were missing something important and at least in retrospect fairly obvious: namely, the fact that most of the companies who could make good use of computers had already bought them by now. The business-computing industry would doubtless continue to be profitable for many and even to grow steadily alongside the economy, but its days of untapped potential and explosive growth were behind it. Consumer computing, on the other hand, was still largely virgin territory. Millions of people were out there who had been frustrated by the limitations of the machines at the heart of the brief-lived first home-computer boom, but who were still willing to be intrigued by the next generation of computing technology, still willing to be sold on computers as an everyday lifestyle accessory. Give them a truly elegant, easy-to-use computer — like, say, the Macintosh — and who knew what might happen. This was the vision Jef Raskin had had in starting the ball rolling on the Mac back in 1979, the one that had still been present, if somewhat obscured even then by a high price, in the first released version of the machine with its “the computer for the rest of us” tagline. And this was the vision that Sculley betrayed after Jobs’s departure by keeping prices sky-high and ignoring the consumer market.

“We don’t want to castrate our computers to make them inexpensive,” said Jean-Louis Gassée. “We make Hondas, we don’t make Yugos.” Fair enough, but the Mac was priced closer to Mercedes than Honda territory. And it was common knowledge that Apple’s profit margins remained just about the fattest in the industry, thus raising the question of how much “castration” would really be necessary to make a more reasonably priced Mac. The situation reached almost surrealistic levels with the release of the Mac IIfx in March of 1990, an admittedly “wicked fast” addition to the product line but one that cost $9870 sans monitor or video card, thus replacing the metaphorical with the literal in Gassée’s favored comparison: a complete Mac IIfx system cost more than most actual brand-new Hondas. By now, the idea of the Mac as “the computer for the rest of us” seemed a bitter joke.

Apple was choosing to fight over scraps of the business market when an untapped land of milk and honey — the land of consumer computing — lay just over the horizon. Instead it was the IBM-compatible machines that lurched, in fits and starts, over to fill that space, adopting in the process most of the Mac’s best ideas, even if they seldom managed to implement those ideas quite as elegantly. By the time Apple woke up to what was happening in the 1990s and rushed to fill the gap with a welter of more reasonably priced consumer-grade Macs, it was too late. Computing as most Americans knew it was exclusively a Wintel world, with Macs dismissed as incompatible, artsy-fartsy oddballs. All but locked out of the fastest-growing sectors of personal computing, the very sectors the Macintosh had been so perfectly poised to absolutely own, Apple was destined to have a very difficult 1990s. So difficult, in fact, that they would survive the decade’s many lows only by the skin of their teeth.

This cartoon by Tom Meyer, published in the San Francisco Chronicle, shows the emerging new popular consensus about Apple by the early 1990s: increasingly overpriced, bloated designs and increasingly clueless management.

Now that the 68000 Wars have faded into history and passions have cooled, we can see that the Macintosh was in some ways almost as ill-served by its parent company as was the Commodore Amiga by its. Apple’s management in the post-Jobs era, like Commodore’s, seemed in some fundamental way not to get the very creation they’d unleashed on the world. And so, as with the Amiga, it was left to the users of the Macintosh to take up the slack, to keep the vision thing in the equation. Thankfully, they did a heck of a job with that. Something in the Mac’s DNA, something which Apple’s new sobriety could mask but never destroy, led it to remain a hotbed of inspiring innovations that had little to do with the nuts and bolts of running a day-to-day business. Sometimes seemingly in spite of Apple’s best efforts, the most committed Mac loyalists never forgot the Jobsian rhetoric that had greeted the platform’s introduction, continuing to see it as something far more compelling and beautiful than a tool for business. A 1988 survey by Macworld magazine revealed that 85 percent of their readers, the true Mac hardcore, kept their Macs at home, where they used them at least some of the time for pleasure rather than business.

So, the Mac world remained the first place to look if you wanted to see what the artists and the dreamers were getting up to with computers. We’ve already seen some examples of their work in earlier articles. In the course of the next few, we’ll see some more.

(Sources: Amazing Computing of February 1988, April 1988, May 1988, and August 1988; Info of July/August 1988; Byte of May 1986, June 1986, November 1986, April 1987, October 1987, and June 1990; InfoWorld of November 26 1984; Computer Chronicles television episodes entitled “The New Macs,” “Macintosh Business Software,” “Macworld Special 1988,” “Business Graphics Part 1,” “Macworld Boston 1988,” “Macworld San Francisco 1989,” and “Desktop Presentation Software Part 1”; the books West of Eden: The End of Innocence at Apple Computer by Frank Rose, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Computer Company by Owen W. Linzmayer, and Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything by Steven Levy; Andy Hertzfeld’s website Folklore.)

 

Will Wright’s City in a Box

Will Wright, 1990

In “The Seventh Sally,” a story by the great Polish science-fiction writer Stanislaw Lem, a god-like “constructor” named Trurl comes upon a former tyrant named Excelsius, now exiled to a lonely asteroid by the peoples of the planets he used to terrorize. Upon learning of Trurl’s powers, Excelsius demands that he restore him to his throne. Trurl, however, is wise enough to consider what suffering Excelsius’s reinstatement would bring to his subjects. So, he instead fashions an intricate simulacrum of a kingdom for Excelsius to rule over.

And all of this, connected, mounted, and ground to precision, fit into a box, and not a very large box, but just the size that could be carried about with ease. This Trurl presented to Excelsius, to rule and have dominion over forever; but first he showed him where the input and output of his brand-new kingdom were, and how to program wars, quell rebellions, exact tribute, collect taxes, and also instructed him in the critical points and transition states of that microminiaturized society — in other words the maxima and minima of palace coups and revolutions — and explained everything so well that the king, an old hand in the running of tyrannies, instantly grasped the directions and, without hesitation, while the constructor watched, issued a few trial proclamations, correctly manipulating the control knobs, which were carved with imperial eagles and regal lions. These proclamations declared a state of emergency, martial law, a curfew, and a special levy. After a year had passed in the kingdom, which amounted to hardly a minute for Trurl and the king, by an act of the greatest magnanimity — that is, by a flick of the finger at the controls — the king abolished one death penalty, lightened the levy, and deigned to annul the state of emergency, whereupon a tumultuous cry of gratitude, like the squeaking of tiny mice lifted by their tails, rose up from the box, and through its curved glass cover one could see, on the dusty highways and along the banks of lazy rivers that reflected the fluffy clouds, the people rejoicing and praising the great and unsurpassed benevolence of their sovereign lord.

And so, though at first he had felt insulted by Trurl’s gift, in that the kingdom was too small and very like a child’s toy, the monarch saw that the thick glass lid made everything inside seem large; perhaps too he duly understood that size was not what mattered here, for government is not measured in meters and kilograms, and emotions are somehow the same, whether experienced by giants or dwarfs — and so he thanked the constructor, if somewhat stiffly. Who knows, he might even have liked to order him thrown in chains and tortured to death, just to be safe — that would have been a sure way of nipping in the bud any gossip about how some common vagabond tinkerer presented a mighty monarch with a kingdom. Excelsius was sensible enough, however, to see that this was out of the question, owing to a very fundamental disproportion, for fleas could sooner take their host into captivity than the king’s army seize Trurl. So with another cold nod, he stuck his orb and scepter under his arm, lifted the box kingdom with a grunt, and took it to his humble hut of exile. And as blazing day alternated with murky night outside, according to the rhythm of the asteroid’s rotation, the king, who was acknowledged by his subjects as the greatest in the world, diligently reigned, bidding this, forbidding that, beheading, rewarding — in all these ways incessantly spurring his little ones on to perfect fealty and worship of the throne.

When first published in 1965, Lem’s tale was the most purely speculative of speculative fictions, set as it was thousands if not millions of years in the future. Yet it would take just another quarter of a century before real-world Excelsiuses got the chance to play with little boxed kingdoms of their own, nurturing their subjects and tormenting them as the mood struck. The new strain of living, dynamic worlds filled with apparently living, dynamic beings was soon given the name of “god game” to distinguish it from the more static games of war and grand strategy that had preceded it.

The first of the great god-game constructors, the one whose name would always be most associated with the genre, was a hyperactive, chain-smoking, chain-talking Southerner named Will Wright. This is the story of him and his first living world — or, actually, living city — in a box.


 

Will Wright has always been a constructor. As a boy in the 1960s and 1970s, he built hundreds of models of ships, cars, and planes. At age 10, he made a replica of the bridge of the Enterprise out of balsa wood and lugged it to a Star Trek convention; it won a prize there, the first of many Wright would get to enjoy during his life. When developments in electronics miniaturization made it possible, he started making his creations move, constructing primitive robots out of Lego bricks, model kits, and the contents of his local Radio Shack’s wall of hobbyist doodads. In 1980, the 20-year-old Wright and his partner Rick Doherty won the U.S. Express, an illegal coast-to-coast automobile race created by the organizer of the earlier Cannonball Run. A fighter jet’s worth of electronics allowed them to drive from New York City to Santa Monica in 33 hours and 39 minutes in a Mazda RX-7, cruising for long stretches of time at 120 miles per hour.

Wright was able to indulge these passions and others thanks to his late father, a materials engineer who invented a lucrative new process for manufacturing plastic packaging before dying of leukemia when his son was just 9 years old. The widow he left behind was very patient with her eccentric tinkerer of a son, similar in some ways to his practical-minded father but in others very different. Wright spent five years at various universities in and out of his home state of Louisiana, excelling in the subjects that caught his fancy — like architecture, economics, mechanical engineering, and military history — while ignoring entirely all the others. Through it all, his mother never put any undue pressure on him to settle on something, buckle down, and get an actual degree. When he told her in no uncertain terms that he wouldn’t be taking over the family business his father had left in trust for him, she accepted that as well. Yet even she must have struggled to accept the notion of her 22-year-old son running off to California with Joell Jones, a painter 11 years his senior; the two had bonded when Jones severed a nerve in her wrist and Wright built a gadget out of metal and rubber bands to allow her to continue to paint. The two would marry in 1984.

Given his love for electronic gadgetry, it will likely come as no surprise that Wright was snared quickly by the nascent PC revolution. Already by 1980 he had added an Apple II to his collection of toys, and with it computer programming and computer gaming to his long list of hobbies; his first computerized love was Bruce Artwick’s primitive original Flight Simulator. But it was only after moving to Oakland with Jones that he started thinking seriously about writing a game of his own. This first and arguably last entirely practical, commercial project of his life was apparently prompted by his now living permanently away from home, an adult at last. At some point even a dreamer has to do something with his life, and making computer games seemed as good a choice as any.

His first game was in some ways the antithesis of everything he would do later: a conventional experience in a proven genre, a game designed to suit the existing market rather than a game designed to create its own new market, and the only Will Wright game that can actually be won in the conventional sense. Like many games of its era, its design was inspired by a technical trick. Wright, who had moved on from his Apple II to a Commodore 64 by this time, had figured out a way to scroll smoothly over what appeared to be a single huge background image. “I knew the Apple couldn’t begin to move that much in the way of graphics around the screen that quickly,” he says. “So I designed the game around that feature.”
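Wright never spelled out the trick in detail, so the sketch below models only the general technique the Commodore 64 was known for: the VIC-II video chip provides a hardware fine-scroll register covering 0–7 pixels, meaning a program only has to shift its tile map one coarse column at a time while the register smooths over everything in between. All names and numbers here are illustrative assumptions about that common approach, not a reconstruction of Wright’s code.

```python
# A rough model of the classic C64 smooth-scrolling idea: keep a world
# map far wider than the screen, let a hardware fine-scroll register
# handle sub-tile offsets, and move the visible tile window only once
# every 8 pixels. Illustrative only -- not Wright's actual technique.

WORLD_W, SCREEN_W = 1024, 40      # world and screen widths, in 8-pixel tiles

def scroll_right(world, coarse_x, fine_x):
    """Advance the viewport one pixel; return the newly visible window."""
    fine_x = (fine_x + 1) % 8     # the cheap part: one register write
    if fine_x == 0:               # fine scroll exhausted...
        coarse_x += 1             # ...so shift the tile map one column
    visible = [row[coarse_x:coarse_x + SCREEN_W] for row in world]
    return visible, coarse_x, fine_x

world = [[0] * WORLD_W for _ in range(25)]   # 25 rows of background tiles
view, cx, fx = scroll_right(world, 0, 0)
```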

Raid on Bungeling Bay on the Commodore 64

Raid on Bungeling Bay owed a lot to Choplifter and a little to Beach-Head, sending you off in a futuristic helicopter to strike at the heart of the evil Bungeling Empire, returning when necessary to your home base for repairs and more ammunition. The most impressive aspect of the game, even more so than its graphical tricks, was the sophisticated modeling of the enemy forces. The Bungeling factories would turn out more advanced hardware as time went on, while your ability and need to disrupt supply lines and to monitor and attack the enemy on multiple fronts demanded at least a modicum of strategy as well as reflexes.

Wright sold Raid on Bungeling Bay to Brøderbund Software, who published it in 1984, whereupon it sold a reasonable if hardly overwhelming 30,000 copies on the Commodore 64. But, in contrast to so many of its peers, that wasn’t the end of the story. Hudson Soft in Japan took note of the game, paying Brøderbund and Wright for the right to make it into a cartridge for the Nintendo Entertainment System. Wright claims it sold an astonishing 750,000 copies on the NES in Japan and later the United States, giving him a steady income while he played around with the ideas that would become his next project, the one that would really make his name.

As it happened, the first project merged into the second almost seamlessly. Wright had written a tool for his own use in creating the Bungeling Empire’s cities, a little world editor that would let him scroll around a virtual space, laying down tiles to represent land and sea, factories and gun turrets. He realized at some point — perhaps after his game had shipped and yet he was still tinkering with his world inside the editor — that he found this task of creation much more compelling than the act of destruction that was actually playing the game. Might there be others who felt like him? Based on the success of Electronic Arts’s Pinball Construction Set, a program he hugely admired, he thought there just might be.

One fateful day Wright shared his world editor and his still half-baked ideas about what to do with it with his neighbor Bruce Joffe. An established architect and urban planner, Joffe had studied under Jay Wright Forrester at MIT, generally regarded as the founder of the entire field of system dynamics — i.e., using a computer to simulate a complex, dynamic reality. When he saw Wright’s little Bungeling Empire cities, Joffe was immediately reminded of Forrester’s work. He wasted no time in telling his friend that he really needed to check this guy out.

Even though the two have never to my knowledge met, Jay Wright Forrester and Will Wright were a match made in heaven; they shared much beyond the name of “Wright.” Both, to name one example, got their start in the field of simulation with a flight simulator, Jay Wright Forrester trying to build one and Will Wright trying to figure out how Bruce Artwick’s Flight Simulator really worked.

Driven by his desire to make a flight simulator, Forrester had been instrumental in the creation of Whirlwind, the first real computer, in the sense that we understand the term today, to be built in the United States. (The more canonical example in American textbooks, the ENIAC, could only be “programmed” by physically rewiring its internals. It’s probably better understood as an elaborate calculating machine than a true computer; its original purpose was to calculate static artillery firing tables. As in so many things, politics plays a role in ENIAC’s anointment. The first computer programmable entirely in software, pre-dating even Whirlwind, was EDSAC-1, built at Cambridge University in Britain. That such a feat was first managed abroad seems to be just a bit more than some Americans in Silicon Valley and elsewhere can bring themselves to accept.) The flight simulator never quite came together, but an undaunted Forrester moved on to Project SAGE, an air-defense early-warning system that became easily the most elaborate computing project of the 1950s. From there, he pioneered economic and industrial modeling on computers, and finally, in the late 1960s, arrived at what he called “urban dynamics.” Forrester’s urban modeling created a firestorm of controversy among city planners and social activists; as he put it in his dry way, it “was the first of my modeling work that produced strong, emotional reactions.” He was accused of everything from incompetence to racism when his models insisted that low-cost urban public housing, heretofore widely regarded as a potent tool for fighting poverty, was in reality “a powerful tool for creating poverty, not alleviating it.”

Of more immediate interest to us, however, is the reaction one Will Wright had to Forrester’s work many years after all the controversy had died away. The jacket copy of Forrester’s book Urban Dynamics reads like a synopsis of the simulation Wright was now about to create on a microcomputer: “a computer model describing the major internal forces controlling the balance of population, housing, and industry within an urban area,” which “simulates the life cycle of a city and predicts the impact of proposed remedies on the system.” When Wright’s neighbor Joffe had studied under Forrester in the 1970s, the latter had been constructing physical scale models of his urban subjects, updating them as time went on with the latest data extracted from his computer programs. If he could build a similar program to live behind his graphical Bungeling Empire cities, Wright would have found a much easier way to study the lives of cities. At about the same time that he had that initial conversation with Joffe, Wright happened to read the Stanislaw Lem story that opened this article. If he needed further inspiration to create his own city in a box, he found plenty of it there.

Never one to shy away from difficult or esoteric academic literature, Wright plunged into the arcane theoretical world of system dynamics. He wound up drawing almost as much from John Horton Conway’s 1970 Game of Life, another major landmark in the field, as he did from Forrester. Wright:

System dynamics is a way to look at a system and divide it into, basically, stocks and flows. Stocks are quantities, like population, and flows are rates, like the death rate, the birth rate, immigration. You can model almost anything using those two features. That was how he [Forrester] started system dynamics and that was the approach he took to his modeling. I uncovered his stuff when I started working on SimCity and started teaching myself modeling techniques. I also came across the more recent stuff with cellular automata [i.e., Conway’s Game of Life], and SimCity is really a hybrid of those two approaches. Because his [Forrester’s] approach was not spatial at all, whereas the cellular automata gives you a lot of really interesting spatial tools for propagation, network flow, proximity, and so forth. So the fact that pollution starts here, spreads over here, and slowly gets less and less, and you can actually simulate propagation waves through these spatial structures. So SimCity in some sense is like a big three-dimensional cellular automata, with each layer being some feature of the landscape like crime or pollution or land value. But the layers can interact on the third dimension. So the layers of crime and pollution can impact the land-value layer.
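The hybrid Wright describes needs surprisingly little machinery, which a deliberately tiny sketch can make concrete: a Forrester-style stock (population) adjusted by flows (birth and death rates), sitting on top of cellular-automata layers in which pollution propagates spatially and depresses a land-value layer. Every name and constant below is an illustrative assumption about the shape of the idea, not Maxis’s actual model.

```python
# A toy hybrid of the two approaches Wright names: stocks and flows for
# aggregate quantities, cellular automata for spatial propagation, and
# interaction between layers. All constants are invented for illustration.

import random

SIZE = 8   # a toy 8x8 map; SimCity's real maps were far larger

pollution  = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
land_value = [[1.0] * SIZE for _ in range(SIZE)]
population = 1000.0                       # a Forrester-style "stock"

def step():
    """One tick: diffuse pollution (CA layer), couple layers, update stocks."""
    global pollution, population
    # Cellular-automata layer: each cell averages with its neighbors, so
    # pollution propagates outward in waves while slowly decaying.
    new = [[0.0] * SIZE for _ in range(SIZE)]
    for y in range(SIZE):
        for x in range(SIZE):
            neighborhood = [pollution[(y + dy) % SIZE][(x + dx) % SIZE]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            new[y][x] = 0.95 * sum(neighborhood) / len(neighborhood)
    pollution = new
    # Layer interaction on the "third dimension": pollution drags down land value.
    for y in range(SIZE):
        for x in range(SIZE):
            land_value[y][x] = max(0.0, 1.0 - pollution[y][x])
    # Stock-and-flow layer: flows (rates) adjust the population stock, with
    # the death rate itself driven by average pollution.
    avg_pollution = sum(map(sum, pollution)) / SIZE ** 2
    birth_rate, death_rate = 0.02, 0.01 + 0.02 * avg_pollution
    population += population * (birth_rate - death_rate)

for _ in range(10):
    step()
print(round(population), round(land_value[0][0], 2))
```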

This description subtly reveals something about the eventual SimCity that is too often misunderstood. The model of urban planning that underpins Wright’s simulation is grossly simplified and, often, grossly biased to match its author’s own preexisting political views. SimCity is far more defensible as an abstract exploration of system dynamics than as a concrete contribution to urban planning. All this talk about “stocks” and “flows” illustrates where Wright’s passion truly lay. For him the what that was being simulated was less interesting than the way it was being simulated. Wright:

I think the primary goal of this [SimCity] is to show people how intertwined such things can get. I’m not so concerned with predicting the future accurately as I am with showing which things have influence over which other things, sort of a chaos introduction, where the system is so complex that it can get very hard to predict the future ramifications of a decision or policy.

After working on the idea for about six months, Wright brought a very primitive SimCity to Brøderbund, who were intrigued enough to sign him to a contract. But over the next year or so of work a disturbing trend manifested. Each time Wright would bring the latest version to Brøderbund, they’d nod approvingly as he showed all the latest features, only to ask, gently but persistently, a question Wright learned to loathe: when would he be making an actual game out of the simulation? You know, something with a winning state, perhaps with a computer opponent to play against?

Even as it was, SimCity was hardly without challenge. You had to plan and manage your city reasonably well or it would go bankrupt or drown in a sea of crime or other urban blights and you, the mayor, would get run out of town on a rail. Yet it was also true that there wasn’t a conventional winning screen to go along with all those potential losing ones. Wright tried to explain that the simulation was the game, that the fun would come from trying things out in this huge, wide-open possibility space and seeing what happened. He thought he had ample evidence from his friends that he wasn’t the only one who liked to play this way. They would dutifully build their cities to a point and then, just like Excelsius in the story, would have just as much fun tearing them down, just to see what happened. Indeed, they found the virtual destruction so enjoyable that Wright added disasters to the program — fires, earthquakes, tornadoes, even a rampaging Godzilla monster — that they could unleash at will. As with everything else in SimCity, the motivation for a player consciously choosing to destroy all her labor was just to see what would happen. After all, you could always save the game first. Wright:

When I first started showing the Commodore version, the only thing that was in there was a bulldozer, basically to erase mistakes. So if you accidentally built a road or a building in the wrong place you could erase it with the bulldozer. What I found was that, invariably, in the first five minutes people would discover the bulldozer, and they would blow up a building with it by accident. And then they would laugh. And then they would go and attack the city with the bulldozer. And they’d blow up all the buildings, and they’d be laughing their heads off. And it really intrigued me because it was like someone coming across an ant pile and poking it with a stick to see what happens. And they would get that out of their system in about ten minutes, and then they would realize that the hard part wasn’t destroying, but building it back up. And so people would have a great time destroying the city with a bulldozer, and then they would discover, “Wow, the power’s out. Wow, there’s a fire starting.” And that’s when they would start the rebuilding process, and that’s what would really hook them. Because they would realize that the destruction was so easy in this game, it was the creation that was the hard part. And this is back when all games were about destruction. After seeing that happen with so many people, I finally decided, “Well I might as well let them get it out of their systems. I’ll add disasters to the game.” And that’s what gave me the idea for the disasters menu.

Wright asked Brøderbund to look at his “game” not as a conventional zero-sum ludic experience, but as a doll house or a train set, an open-ended, interactive creative experience — or, to use the term the market would later choose, as a “sandbox” for the player. Wright:

I think it [sandbox gaming] attracts a different kind of player. In fact, some people play it very goal-directed. What it really does is force you to determine the goals. So when you start SimCity, one of the most interesting things that happens is that you have to decide, “What do I want to make? Do I want to make the biggest possible city, or the city with the happiest residents, or the most parks, or the lowest crime?” Every time you have to idealize in your head, “What does the ideal city mean to me?” It requires a bit more motivated player. What that buys you in a sense is more replayability because we aren’t enforcing any strict goal on you. We could have said, “Get your city to 10,000 people in ten years or you lose.” And you would always have to play that way. And there would be strategies to get there, and people would figure out the strategies, and that would be that. By leaving it more open-ended, people can play the game in a lot of different ways. And that’s where it becomes more like a toy.

But Brøderbund just couldn’t seem to understand what he was on about. At last, Wright and his publisher parted ways in a haze of mutual incomprehension. By the time they did so, the Commodore 64 SimCity was essentially complete; it would finally be released virtually unchanged more than two years later.

SimCity on the Commodore 64

For the moment, though, nobody seemed interested at all. After halfheartedly shopping SimCity around to some other publishers (among them Cinemaware) without a bite, Wright largely gave up on the idea of ever getting it released. But then in early 1987, with SimCity apparently dead in the water, he was invited to a pizza party for game developers hosted by a young businessman named Jeff Braun. Braun, who envisioned himself as the next great software entrepreneur, had an ulterior motive: he was looking for the next great game idea. “Will is a very shy guy, and he was sitting by himself, and I felt sorry for him,” Braun says. In marked contrast to Brøderbund, Braun saw the appeal of SimCity before he ever even saw the program in action, as soon as a very reluctant, thoroughly dispirited Wright started to tell him about it. His interest was piqued despite Wright being far from a compelling pitchman: “Will kept saying that this won’t work, that no one likes it.”

Braun nevertheless suggested that he and Wright found their own little company to port the program from the Commodore 64 to the Apple Macintosh and Commodore Amiga, more expensive machines whose older and presumably more sophisticated buyers might be more receptive to the idea of an urban-planning simulation. Thus was Maxis Software born.

Wright ported the heart of the simulation from Commodore 64 assembler to platform-independent C while a few other programmers Braun had found developed user interfaces and graphics for the Macintosh and Amiga. The simulation grew somewhat more complex on the bigger machines, but not as much as you might think. “It got more elaborate, more layers were added, and there was higher resolution on the map,” says Wright, “but it had the same basic structure for the simulation and the same basic sets of tools.”

SimCity on the Macintosh

While Wright and the other programmers were finishing up the new versions of SimCity, Braun scared up a very surprising partner for their tiny company. He visited Brøderbund again with the latest versions, and found them much more receptive to Wright’s project this time around, a switch that Wright attributes to the generally “more impressive” new versions and the fact that by this point “the market was getting into much more interesting games.” Still somewhat concerned about how gamers would perceive Wright’s non-game, Brøderbund did convince Maxis to add a set of optional “scenarios” to the sandbox simulation, time-limited challenges the player could either meet or fail to meet, thus definitively winning or losing. The eight scenarios, some historical (the San Francisco earthquake of 1906, the fire-bombing of Hamburg in 1944), some hypothetical (a nuclear meltdown in Boston in 2010, the flooding of Rio de Janeiro in 2047 thanks to global warming), and some unabashedly fanciful (a monster attack on Tokyo in 1957), were all ultimately less compelling than they initially sounded, being all too clearly shoehorned into an engine that had never been designed for this mode of play. Still, Brøderbund’s perceived need to be able to honestly call SimCity a game was met, and that was the most important thing. Brøderbund happily agreed to become little Maxis’s distributor, a desperately needed big brother to look after them in a cutthroat industry.

SimCity

SimCity shipped for the Macintosh in February of 1989, for the Commodore 64 in April, and for the Amiga in May. Some people immediately sat up to take notice of this clearly new thing; sales were, all things considered, quite strong right out of the gate. In an online conference hosted on June 19, 1989, Wright said that they had already sold 11,000 copies of the Macintosh version and 8000 of the Amiga, big numbers in a short span of time for those relatively small American gaming markets. Presaging the real explosion of interest still to come, he noted that Maxis had had “many inquiries from universities and planning departments.” And indeed, already in August of 1989 the first academic paper on SimCity would be presented at an urban-planning conference. Realizing all too well himself how non-rigorous an exercise in urban planning SimCity really was, Wright sounded almost sheepish in contemplating “a more serious version” for the future.

SimCity for MS-DOS

SimCity would begin to sell in really big numbers that September, when the all-important MS-DOS version appeared. Ports to virtually every commercially viable or semi-viable computer in the world appeared over the next couple of years, culminating in a version for the Super Nintendo Entertainment System in August of 1991.

SimCity for Super Nintendo

It’s at this point that our history of SimCity the private passion project must inevitably become the history of SimCity the public sensation. For, make no mistake, a public sensation SimCity most definitely became. It sold and sold and sold, and then sold some more, for years on end. In 1991, the year it celebrated its second anniversary on the market, it still managed to top the charts as the annum’s best-selling single computer game. Even five years after its release, with Wright’s belated “more serious” — or at least more complicated — version about to ship as SimCity 2000, the original was still selling so well that Maxis decided to rename it SimCity Classic and to continue to offer it alongside its more advanced variant. In that form it continued to sell for yet several more years. Shelf lives like this were all but unheard of in the fickle world of entertainment software.

In all, the original SimCity sold at least 500,000 copies on personal computers, while the Super Nintendo version alone sold another 500,000 to console gamers. Spin-offs, sequels, and derivatives added millions and millions more to those numbers in the years that followed the original’s long heyday; at no point between 1989 and today has there not been at least one SimCity title available for purchase. And, believe me, people have continued to purchase. SimCity 2000 (1994) and SimCity 3000 (1999) both became the best-selling single computer games of their respective release years, while post-millennial iterations have sold in the millions as a matter of routine.

But almost more important than the quantities in which the original SimCity sold and the veritable cottage industry it spawned are the people to whom it was selling. By the time they signed Maxis to a distribution contract, Brøderbund had long since demonstrated their knack for getting past the nerdy hardcore of computer users, for bypassing Dungeons & Dragons and military simulations and all the rest to reach the great unwashed masses of Middle America. Brøderbund’s The Print Shop and their Carmen Sandiego series in particular remain icons of ordinary American life during the 1980s. SimCity must be added to that list for the 1990s. Beginning with a June 15, 1989, piece in no less august a journal than The New York Times, seemingly every newspaper and news magazine in the country wrote about SimCity. For a mainstream media that has never known quite what to make of computer games, this was the rare game that, like Carmen Sandiego, was clearly good for you and your kids.

SimCity even penetrated into the political sphere. With a mayoral election pending in 1990, The Providence Journal set up a contest for the five candidates for the post, letting each have his way with a simulated version of Providence, Rhode Island. The winner of that contest also wound up winning the election. More amusing was the experiment conducted by Detroit News columnist Chuck Moss. He sent Godzilla rampaging through a simulated Detroit, then compared the result with the carnage wrought by Coleman Young during his two-decade real-world reign as mayor. His conclusion? Godzilla had nothing on Mayor Young.

If the interest SimCity prompted in the mainstream media wasn’t unusual enough, academia’s eagerness to jump on the bandwagon in these years long before “game studies” became an accepted area of interest is even more astonishing. Articles and anecdotes about Will Wright’s creation were almost as prevalent in the pages of psychology and urban-planning journals as they were in newspapers. Plenty of the papers in the latter journals, written though they were by professionals in their field who really should have known better, credited Wright’s experiment with an authority out of all proportion to the fairly simplistic reality of the simulation, in spite of candid admissions of its limitations from the people who knew the program best. “I wouldn’t want to predict a real city with it,” Wright said. Bruce Joffe, the urban planner who had set Wright down the road to SimCity, responded with one word when asked if he would use the program to simulate any aspect of a city he was designing in the real world: “No.” And yet SimCity came to offer perhaps the most compelling demonstration of the Eliza Effect since Joseph Weizenbaum’s simple chatbot that had given the phenomenon its name. The world, SimCity proved once again, is full of Fox Mulders. We all want to believe.

In that spirit, SimCity also found a home in a reported 10,000 elementary-, middle-, and high-school classrooms across the country, prompting Maxis to offer a new pedagogical version of the manual, focused on techniques for using the simulation as a teaching tool. And SimCity started showing up on university syllabi as well; the construction of your own simulated city became a requirement in many sociology and economics classes.

Back in May of 1989, Computer Gaming World had concluded their superlative review of SimCity — one of the first to appear anywhere in print — by asking their readers to “buy this game. We want them to make lots of money so they’ll develop SimCounty, SimState, SimNation, SimPlanet, SimUniverse… billions and billions of games!” The hyperbole proved prescient; Maxis spent the 1990s flooding the market with new Sim titles.

SimEarth on MS-DOS

Jay Wright Forrester’s follow-up to his book Urban Dynamics had been World Dynamics, an inquiry into the possibility of simulating the entire world as a dynamic system. Wright’s own next game, then, was 1990’s SimEarth, which attempted to do just that, putting you in charge of a planet through 10 billion years of geological and biological evolution. SimEarth became a huge success in its day, one almost comparable to SimCity. The same year-end chart that shows SimCity as the best-selling single title of 1991 has SimEarth at number two — quite a coup for Maxis. Yet, like virtually all of the later Sim efforts, SimEarth is far less fondly remembered today than is its predecessor. The ambitious planet simulator just wasn’t all that much fun to play, as even Wright himself admits today.

But then, one could make the same complaint about many of Maxis’s later efforts, which simulated everything from ant colonies to office towers, healthcare systems (!) to rain forests. New Sim games began to feel not just like failed experiments but downright uninspired, iterating and reiterating endlessly over the same concept of the open-ended “software toy” even as other designers found ways to build SimCity‘s innovations into warmer and more compelling game designs. Relying heavily as always on his readings of the latest scientific literature, Wright could perhaps have stood to put away the academic journals from time to time and crack open a good novel; he struggled to find the human dimension in his simulations. The result was a slow but steady decline in commercial returns as the decade wore on, a trend from which only the evergreen SimCity and its sequels were excepted. Not until 2000 would Maxis finally enjoy a new breakthrough title, one that would dwarf even the success of SimCity… but that is most definitely a story for another time.

Given its storied history and the passion it once inspired in so many players, playing the original SimCity for the first time today is all but guaranteed to be a somewhat underwhelming experience. Even allowing for what now feels like a crude, slow user interface and absurdly low-resolution graphics, everything just feels so needlessly obscure, leaving you with the supreme frustration of losing again and again without being able to figure out why you’re losing. Not for nothing was this game among the first to spawn a book-length strategy guide — in fact, two of them. You need inside information just to understand what’s going on much of the time. There are games that are of their time and games that are for all time. In my perhaps controversial opinion, the original SimCity largely falls into the former category.

But, far from negating SimCity‘s claim to our attention, this judgment only means that we, as dutiful students of history, need to try even harder to understand what it was that so many people first saw in what may strike us today as a perversely frustrating simulation. Those who played the original SimCity for the first time, like those who played the original Adventure, Defender of the Crown, and a bare handful of other landmark games in the history of the hobby, felt the full shock of a genuinely new experience that was destined to change the very nature of gaming. It’s a shock we can try to appreciate today but can never fully replicate.

You can see traces of SimCity in many if not most of the games we play today, from casual social games to hardcore CRPG and strategy titles. Sid Meier, when asked in 2008 to name the three most important innovations in the history of electronic gaming, listed the invention of the IBM PC, the Nintendo Seal of Quality… and, yes, SimCity. “SimCity was a revelation to most of us game designers,” says Meier. “The idea that players enjoyed a game that was open-ended, non-combative, and emphasized construction over destruction opened up many new avenues and possibilities for game concepts.” Many years before Meier’s statement, Russell Sipe, the respected founder of Computer Gaming World, said simply that “SimCity has changed the face of computer-entertainment software.” He was and is absolutely correct. Its influence really has been that immense.

(Sources: Magazines include Amazing Computing of October 1989; Game Developer from April 2006; Macworld from April 1990; Computer Gaming World from May 1989; Compute! from January 1992; The New Yorker from November 6 2006. Newspapers include The San Francisco Chronicle from November 3 2003; The New York Times from June 15 1989; The Los Angeles Times from October 2 1992. Books include The Cyberiad by Stanislaw Lem; The SimCity Planning Commission Handbook by Johnny L. Wilson; Game Design Theory and Practice by Richard Rouse III; The City of Tomorrow and Its Planning by Le Corbusier; The Second Self by Sherry Turkle. Current and archived online sources include John Cutter’s blog; Game Research; articles about Will Wright and Sid Meier on Wired; The Next American City; Reform; GameSpot; a 1989 talk given by Jay Wright Forrester, which is hosted at MIT; First Monday; Taylor & Francis Online. And finally, there’s the collection of Brøderbund archives I went through during my visit to the Strong Museum of Play.

Beginning with SimCity 2000, the more playable later iterations of the franchise are all available for purchase in various places online. For those of an historical bent who’d like to experience the original, I offer a zip that includes the first three versions — for the Macintosh, Commodore 64, and Amiga.)

Footnotes
1 The more canonical example in American textbooks, the ENIAC, could only be “programmed” by physically rewiring its internals. It’s probably better understood as an elaborate calculating machine than a true computer; its original purpose was to calculate static artillery firing tables. As in so many things, politics plays a role in ENIAC’s anointment. The first computer programmable entirely in software, pre-dating even Whirlwind, was EDSAC-1, built at Cambridge University in Britain. That such a feat was first managed abroad seems to be just a bit more than some Americans in Silicon Valley and elsewhere can bring themselves to accept.
 


Cliff Johnson’s Fool’s Errand

The Fool's Errand

One sunny day, a light-hearted fool strolled along a hilly path, whistling a merry tune. A long wooden pole was slung over his shoulder and attached to it was a cloth bundle which carried his life’s possessions.

“What a marvelous afternoon!” he exclaimed to no one in particular, pausing to appreciate the lovely countryside.

Soon the trees parted and the path led to a small clearing, ending abruptly at the edge of a treacherous cliff. But the fool was undaunted and kept at his swift pace, steadily approaching the sheer drop.

“Your folly is most curious,” a voice boomed. “Have you no fear of death?”

Just as one leg dangled over the side of the cliff, the fool hesitated.

“Who dares to interrupt my errand?” he demanded impatiently.

“I dare,” the bright yellow sun replied.

“Well, then,” the fool considered, “I seek the fourteen treasures of the world and I am told that a man who strays from his path is lost.”

“That may well be true,” spoke the sun, “but I fear that you are already lost. Take this map as my gift. It will aid you in your quest.”

And in a flash of light, an aged parchment appeared at his feet.

“At last! A path to follow!” cried the fool, happily taking the map.

“Perhaps,” the sun murmured, “yet things are never as simple as they may seem.”

But the fool had already run back down the hill and did not hear the sun’s parting words.

Long before “game developer” was recognized by universities as a legitimate career to which one could aspire, people from a dizzying array of backgrounds stumbled into the field. Plenty were the programmers and technologists that you might expect, but plenty of others came from places much farther afield. Infocom, just to take one example, included among their ranks a construction engineer, a journalist, a science-fiction author, a medical doctor, and two lost literature majors, while Sierra’s two most prominent designers were a former housewife and a former jazz musician. Other companies could boast architects, psychologists, rocket scientists, poets, and plain old high-school students. Taken in this light, the story of Cliff Johnson, a filmmaker who decided to start making computer games instead, may not seem quite so atypical. The first game he made, however, is anything but typical. The Fool’s Errand is one of the precious gems of its era, despite — because of? — having been made by a fellow with little intrinsic interest in computers and even less in playing games on them. For an industry that has so often struggled to get beyond a handful of fictional and mechanical genres inspired by a tiny sliver of the rich cultural tapestry of the human race, it’s yet one more reminder of just what a good thing a little diversity can be.

Born in 1953 in Connecticut as the only child of a pair of schoolteachers, Cliff Johnson manifested creativity and curiosity almost from the moment he arrived. As a boy, he spent hours tramping around the woods and dales that surrounded his family home. He loved maps, loved to imagine himself an explorer on the hunt for hidden pirate treasure (“X marks the spot!”). When not roaming the woods, he loved making things with his own hands, whether that meant playing with an erector set or spending long afternoons in the basement doing home-grown chemistry experiments. Gunpowder was easy, the formula printed in lots of places. Chlorine gas proved more tricky, both to make and to get rid of; thankfully the basement had some windows just below its ceiling that helped Cliff get rid of the evidence before Mom and Dad made it home.

Any possibility of Cliff becoming a cartographer or a scientist was, however, derailed on the day that he saw the classic horror flick House on Haunted Hill, starring no less an icon than Vincent Price. Like millions of other kids across the country, he became a B-movie fanatic, a devotee of all things monstrous, horrific, and/or alien. But unlike most fans, Cliff’s personality demanded that he do more than passively consume his obsession. Getting his hands on a Super 8 camera, he started making his own little movies. His technique evolved with impressive speed; soon he was doing stop-motion with live actors to craft his special-effects sequences, a tricky proposition even for professionals. As he got older and his teenage tastes, relatively speaking, matured, he discovered the allure of camp, moving from pulpy horror to slapstick comedy. His magnum opus, shown in his high-school auditorium on three glorious evenings, was called The Return of the Freshman. (It was, naturally, a sequel, and one with a name that beat George Lucas to the punch by thirteen years at that.)

Cliff Johnson

The summer before, while The Return of the Freshman was still in its planning stages, Cliff and his parents had visited Disneyland. He was no stranger to amusement parks, but knew them as the seedy, vaguely disreputable places they generally were at that time, echoes of the still older traveling circuses. Disneyland, however, was something different. In addition to being clean and wholesome and family-friendly, care had been taken to turn its rides and other attractions into real experiences. Cliff was particularly entranced by the lovingly sculpted animatronic characters who actually moved. “I could do that!” was his boyish response; after all, he’d gotten quite good at sculpting monsters and props for his movies. Back home, the local amusement park, a run-down place called Lake Compounce, had a ride called Laff-in-the-Dark that had fallen on hard times. Once full of chills and thrills, its bits and pieces had broken down and been removed one by one, so that now it was largely just a ride through a pitch-black tunnel. Cliff asked the Norton family that ran the park for permission to walk through the ride while it was closed, measuring its every dimension, sketching its every curve and straightaway. He and his girlfriend Janice then made models and sketches illustrating how they thought the ride could be restored to its former glory. Showing an audacity that would serve him well throughout his life, Cliff formally proposed their services to the Nortons. For $1000, they would bring a little taste of Disneyland to Lake Compounce. The Norton family agreed, and thus, between shoots for The Return of the Freshman, Cliff along with cast and crew and Janice built monsters and lights and installed them in the ride. The Norton patriarch, also the mayor of the city of Bristol at that time, was so thrilled with Cliff’s work that he agreed to appear in the movie, playing himself as he looked out of his City Hall window at a flying saucer whizzing by. (“What a sport!” remarks Cliff today.)

Cliff did such a good job on that hometown ride that word got out on the amusement-park circuit about this talented teenager who, being a teenager, worked pretty cheap. He spent the next few years traveling the country as far from home as Colorado and California, making monsters for low-rent amusement parks and saving money for his dream of attending film school right in the heart of Hollywood, at the University of Southern California.

He finally began USC Film School in 1974. University worked on him just as it ideally ought to, opening new intellectual vistas. Having entered an aficionado of monster movies and Disney, perhaps primed for a career in Hollywood special effects — and at a good time too, what with George Lucas and Steven Spielberg just around the corner — he discovered there a new world of film, film made as art for art’s sake. More specifically, he discovered avant-garde animation. Working under Professor Gene Coe, a semi-legendary figure at USC, he made a number of experimental films that did very well on the festival circuit, earning several prizes. He still remembers his years at USC as some of the best and most artistically productive of his life.

But then, in 1979, he was suddenly finished, with a freshly minted Master’s Degree in his hand and no clear idea of what to do next. Virtually the entire avant-garde animation industry, to the extent that such a thing existed at all, existed in Canada and Europe rather than the United States. Cliff couldn’t see himself moving there, but he also no longer had any desire to become a part of the mainstream Hollywood machine that surrounded him in Southern California. So, he became an entrepreneur, an all-purpose filmmaker-for-hire who served a client list consisting mostly of big corporations that needed films made for practical in-house purposes and didn’t want to pay too much for the service. Cliff, by now an accomplished animator as well as cameraman, could do entire films virtually by himself, adding charts and graphics and titles and cartoons and whatever else was needed to his live-action footage to meet his clients’ needs. He did a large number of training films for Southern California Edison in particular, producing, as he would later put it, “such notable works as Heating, Air Conditioning and Ventilation and other film classics.” Yes, it was pretty boring stuff, but it was a pretty decent living.

And yet the fact remained that his new life was as much of a creative comedown from his art-for-art’s-sake days at USC as it was a financial windfall. An artistically stymied Cliff thus began to look around for diversions from his dull working life. Three influences in particular came together at this time to start him in earnest down the road toward The Fool’s Errand.

First, there were his parties. As far back as his teenage years, he had loved to throw elaborate multimedia parties — multimedia in the old sense of the word, implying the blending of different sorts of media in a physical rather than a digital space. He would fill the rooms of a house with lights, sound, music, film, figures, and props, and arrange his guests’ progress through the house so as to tell a little story or illustrate a theme. Soon he began adding an interactive element, a little puzzle or mystery for his guests to solve. These mysteries were, he notes wryly today, “more jobs for Watson than Sherlock,” but, despite or because of their simplicity, his guests really took to them. He kept throwing the parties, which grew ever more elaborate all the while, through his time at USC and especially while working as a filmmaker-for-hire, when he desperately needed the creative outlet they provided.

Next, there was Masquerade. Soon after leaving USC, Cliff became one of countless thousands of people all over the world to be fascinated by a little children’s picture book called Masquerade. Written by a reclusive English painter named Kit Williams, it tells the story of a hare who carries a treasure from the Moon to the Sun, with fifteen of Williams’s intricate paintings as illustrations. What so enthralled Cliff and all those others, however, was the end of the book, which reveals that the hare has lost his treasure, and that it’s up to you, the reader, to find from clues scattered through the book where it now lies in the real world. Masquerade became an international phenomenon that obsessed treasure hunters and puzzle solvers for more than two-and-a-half years, selling nearly 2 million books in the process. Cliff, who didn’t personally enjoy solving puzzles all that much, was perhaps less obsessed than many of those buyers, but he found the idea of Masquerade, of a book that could stand alone but could also serve as a grand puzzle, endlessly intriguing as something he might create.[1]

And finally, there was the tarot deck. Not remotely of a spiritual or supernatural bent, Cliff nevertheless came upon a lovely Rider-Waite tarot deck and found himself fascinated with the characters and ideas represented there.

A section of the Fool’s Errand treasure map.

All of these influences merged together in a 1981 project that would prove to be a non-computerized prototype of the final version of The Fool’s Errand that was still six years in the future. Wanting to create another fun and unique experience for his friends as a Christmas gift, like all those themed parties, Cliff decided to write, just for them, a little book much like Masquerade, telling the story of a Fool who wanders through a fairy-tale land based loosely on the world of the tarot. Filling the back of the book, after the 21-page story, were another 14 pages containing pieces of a treasure map; shades of Cliff’s childhood roaming the Connecticut woodlands dreaming of pirate maps and buried treasure. The recipient had to cut out and assemble the pieces using clues from the story, which was divided into 81 sections, each relating to one piece of the map. Accomplishing that got you to an endgame, a crossword requiring you to correctly place the names of the 14 treasures mentioned in the story to decode a final message: “Merry Christmas!” (Those of you who’ve already played The Fool’s Errand on a computer will recognize all of this as essentially the second half of that game, the part you embark on after completing all of the initial puzzles.) Unlike Masquerade, over which so many puzzlers fretted for years, The Fool’s Errand was designed to be a pleasant challenge but not an exhausting one, solvable in a single long, lazy holiday afternoon. He was thus disappointed when, out of the dozens of people to whom he sent the book, only three actually solved it. The lesson he took away was that, while he believed his friends to be a pretty intelligent group on the whole, this sort of complex puzzle required a special kind of intelligence — or, perhaps better said, a special kind of personality — that made it a poor fit for most of them. He put his storybook back on the shelf, and returned to the themed parties that were so much better received.

But Cliff’s reputation among his friends as a devious mind was now established, and would lead to his introduction to the brave new world of computerized puzzle design. One of his friends, Allen Pinero, came to him with a proposition. Pinero had jumped onto the home-computer bandwagon early, purchasing an Apple II, and had devised a unique text-adventure engine that let the player control two characters at once through a split-screen view. With the core programming in place, though, he was having some trouble devising a plot and puzzles — in short, something for the characters to actually do. Despite knowing nothing about the state of the art in home computers, much less adventure games — he’d never played or even seen one in his life before Pinero showed him a few of Scott Adams’s to prime his pump — Cliff came up with a plot that tangled together several stories from Greek mythology; the two characters under the player’s control became none other than Jason and Hercules. He also devised a batch of puzzles that often required the characters to work together from different rooms, and to illustrate their adventures he drew pictures freehand, which Pinero than translated into vector graphics on the screen. Released in late 1982 by Scott Adams’s Adventure International, Labyrinth of Crete, like most Adventure International games by that time, made little impact in either the trade press or at retail, although it did sell well enough through their mail-order catalogs that they funded ports to the Atari 8-bits and the Commodore 64.

For Cliff it was a fun little experience in its way, but also a frustrating one. It’s safe to say that it didn’t ignite any dormant passion for computers or computer games. He chafed constantly at the limitations of the two-word parser and the primitive world model. It often seemed that nine out of ten ideas he proposed were greeted by Pinero with a “Sorry, can’t do that,” followed by some esoteric technical reasoning he didn’t really understand and didn’t really care to. Pinero’s Apple II itself remained to him an incomprehensible and irascible black (or rather beige) box, all strident squawks and ugly colors. He was, needless to say, completely baffled by Pinero’s efforts to program the thing. If anything, the experience only confirmed his dislike of computers. He certainly didn’t rush out to buy one himself. He and Pinero did discuss doing another game together, but Pinero in particular was feeling completely burnt-out by all the work that had gone into the first — far more work than he had ever imagined it would be.

Cliff’s opinion of computers didn’t change until one day in late 1984 when he idly wandered into a store selling the Apple Macintosh and promptly fell in love. Ironically, he had long since shot one of his corporate films inside Xerox’s famed Palo Alto Research Center, the very place where most of the ideas behind the Macintosh were invented. For whatever reason, that experience had left little impression on him, done nothing to alter his opinion of computers as balky, unpleasant contraptions. The Macintosh itself, however, just did it for him right away. He loved its friendly demeanor, loved the simplicity of its point-and-click operating system, loved the elegance of its crisp black-and-white display in contrast to the ugly blotches of pixelated color he remembered on the screen of Pinero’s Apple II. He became just another of the thousands of creative souls, many of them far removed from your typical computer nerd, who saw magic possibility in “the computer for the rest of us.” He simply had to have one. A few thousand dollars later, he was armed with a shiny new 512 K “Fat Mac” with all the bells and whistles, purchased with vague justifications that it would be useful for his business.

Cliff was perhaps unusually receptive to the idea of a life-changing event at about this point. His work as a filmmaker was more stultifying than ever. Even a chance to do animations for a short-lived children’s television series called Out of Control, which was broadcast by the Nickelodeon cable channel as their first original series ever, hadn’t lifted his malaise. So, yes, he was looking for a way out even before he wandered into that store. Soon his new Macintosh would provide it.

He first programmed his Macintosh by writing macros for keeping track of his business finances in the Microsoft Multiplan spreadsheet. His programming began in earnest, however, only when a friend of his, knowing he was very enamored with his new computer, gifted him with a copy of Microsoft BASIC for the machine. It was, surprisingly for this inveterate computer hater, not quite Cliff’s first exposure to the language. The only computerized gadget he had ever owned prior to purchasing his Macintosh had been an Atari VCS game console, for which he had received, again as a gift, Atari’s “BASIC Programming” cartridge. Offered as a sop to fretful parents thinking of replacing Junior’s game console with a real home computer, it was a shockingly primitive creation even by the standards of its day. Programs, which had to be laboriously entered using Atari’s infuriating “keyboard controllers,” could be all of 63 characters in length, and couldn’t be saved. But despite it all, Cliff had found the experience of programming vaguely interesting, enough to devote an afternoon or two to, as he puts it, “getting a tiny square to move around the screen.” Now, with this latest gift of a BASIC, those memories came back, and he started learning to program his Macintosh with real enthusiasm. The natural question then became what to do with his burgeoning skills.

Cliff happened to be acquainted with Philip Proctor and Peter Bergman of the legendary comedy troupe The Firesign Theatre. For a time, they all discussed bringing Firesign’s most famous character, the hard-boiled detective Nick Danger, to interactive life via some sort of adventure game. Yet that puzzle-filled storybook that Cliff had made several years before, the one that had left him feeling like it was a genuinely great idea that just hadn’t found the right audience, kept popping into his head. People who played on computers at that time — yes, even “computers for the rest of us” like the Macintosh — tended to be the sort of people who noticed the little things, who were intrigued by them. What might they make of a computerized puzzle book? Married by this point, he told his wife Kathy one day that he had to drop the filmmaking business, had to drop everything and find out. With Kathy still attending university, they would just have to live on savings and credit cards while he saw it through. Thus was The Fool’s Errand reborn as a computer game.

Let’s be clear: it was a crazy thing to do. Having programmed for a bare handful of months, Cliff proposed programming a commercial-quality game. Having never seriously played a computer game in his life, he proposed designing one. Knowing no one in and nothing about the games industry, he proposed selling his creation at some point to a publisher. He didn’t even like to solve puzzles, not really. His consolation, if he had only known it, might have been that he was mirroring to an uncanny degree Kit Williams, the man who had set him down this path in the first place. Kit also had never evinced the slightest interest in actually solving puzzles, had conceived the grand puzzle that was Masquerade strictly as a gimmick to get people to really look at his artwork and — let’s be honest here — to sell books.

Cliff started, as you’d expect, with his old storybook itself. His original story of the Fool’s wanderings through a tarot-inspired fairy-tale land went into the computer version almost verbatim. The patchwork treasure map also went in, consisting of the same 81 tiles, each linked to a section of the story; it would be much easier to unscramble on the monitor screen, requiring only mouse clicks rather than scissors and glue. And the crossword full of treasures, your reward for completing the map, remained as the final puzzle. But fleshing out this spine, often called today the “meta-puzzle” of the game, would be a collection of other, smaller puzzles that were new to the computer version. Entering each treasure in the final puzzle, for instance, would require not just that you figure out what that treasure should be from the story but that you solve another set-piece puzzle as well. And most of the story itself would be hidden from you at the beginning; opening up the other sections for reading would require, you guessed it, solving puzzles.
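It’s worth pausing on just how simple the computerized map is to represent, compared with all that scissoring. Here is a tiny sketch — mine, and hypothetical throughout; Johnson’s real code was BASIC, not C — of the click-to-swap assembly the mouse made possible:

/* Hypothetical sketch of click-to-swap map assembly (not Johnson's code).
   The 81 tile pieces live in a 9x9 grid; clicking two slots swaps them,
   and the map is solved when every piece sits in its home slot. */
#include <stdbool.h>

#define GRID 9

typedef struct { int piece[GRID * GRID]; } Map;  /* piece id per slot */

void swap_slots(Map *m, int a, int b)
{
    int t = m->piece[a];
    m->piece[a] = m->piece[b];
    m->piece[b] = t;
}

bool map_solved(const Map *m)
{
    for (int i = 0; i < GRID * GRID; i++)
        if (m->piece[i] != i)       /* piece i belongs in slot i */
            return false;
    return true;
}

Two mouse clicks replace scissors and glue, and the “solved” test is nothing more than checking that every piece is home.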

A friend of Cliff’s used to have a subscription to Games magazine, and would loan him each issue after he was finished with it, by which time all of the puzzles were marked up with his solutions. Cliff didn’t care. He wasn’t so interested in solving the puzzles, which took time he didn’t feel he could spare anyway, as he was in looking at how they were put together, enjoying them almost as one might a work of art. Although he didn’t realize it at the time, he was already thinking like a game designer. Now that subconscious preparation was about to serve him well.

The individual puzzles he crafted for his game are multifarious, many of them old favorites of the sort that he had studied in those magazines: word searches, anagrams, cryptograms, crosswords, mazes, jigsaws. Others, however, are delightfully original, like the tarot-inspired card game that requires you to first figure out what the rules are before you can concentrate on actually winning the thing. Some, like the word-concatenation puzzles, are almost naive outgrowths of Cliff’s early experiments with BASIC. A few, the least satisfying in my opinion, are essentially action games, dependent as much on reflexes as smarts.

Through the early months, Cliff was writing each puzzle as its own self-contained BASIC program, unclear exactly how he would tie them together to create a unified experience. Most of his problems came down to Microsoft BASIC itself. Because it was interpreted rather than compiled, it was painfully slow. Even worse, its programs required that the end user also have Microsoft BASIC in order to run them. In short, it was an environment for casual hobbyists and students of programming, not for the creation of a full-fledged commercial game. About a year after he’d first bought his Macintosh, a life-saver arrived in the form of ZBASIC, a compiled version of the language whose programs could run on any Macintosh, and at several times the speed of the Microsoft version at that. There were unfortunately enough syntactical differences between the two dialects that Cliff had to spend quite some time porting his code, but he ended up with a much more powerful and flexible language that was up to the task of implementing The Fool’s Errand.

Very much the amateur, self-taught programmer, Cliff wrote code that was, as a few technical friends told him at the time and as he freely admits today, neither terribly efficient nor terribly well-structured. Yet it had a couple of overriding virtues: it worked, and Cliff at least understood how it worked. Throughout the development of The Fool’s Errand, he constantly shared his puzzles in progress with his wife, with his old friend and Labyrinth of Crete partner Allen Pinero, and with another old friend, David Wood. Still, Cliff remained haunted by a “morbid pessimism” that at some point the whole house of cards, built from dozens of little BASIC programs all piled atop and beside one another, would collapse into hopeless chaos.

But it didn’t happen, and by the end of 1986 he had something complete enough to start shopping to publishers. Cliff still knew next to nothing about the games industry, but once more that old audacity, that willingness to just call his supposed betters and ask for whatever it was he wanted, served him well. A few publishers showed serious interest, despite the fact that the Macintosh market was still quite a minor one when it came to games. He met with Activision, publisher of what remains to this day the only computer game to have ever really captured Cliff’s interest as a player, the casual Mahjong puzzler Shanghai. They were quite willing to sign him, but the royalty they offered seemed rather paltry and, even worse, they insisted that he sign over to them his copyright. If there was one thing Cliff’s years in and around Hollywood had taught him you should never do, it was that. So he ended up signing instead with a young Macintosh-centric publisher called Miles Computing. Tiny though they were by the standards of the industry at large, they had already made a decent name for themselves in Macintosh circles with games like Harrier Strike Mission, which as the platform’s first available flight simulator had done very well for itself, and a line of clip-art disks for desktop publishers. They offered a much better royalty than Activision, were willing to let him keep his copyright, and were based right there in Southern California.

What Miles wasn’t terribly good at doing, Cliff soon learned to his dismay, was actually selling software that didn’t spontaneously sell itself. Released at last in April of 1987 with absolutely no promotion, The Fool’s Errand sank without a trace. One or two reviews buried deep inside magazines, lukewarm and noncommittal, became the full extent of its press presence. Cliff was left in an uncomfortable limbo, unsure what to do with himself next. His savings were exhausted, his credit-card debt was now approaching $50,000, and his royalties were so minimal as to be virtually nonexistent. He wasn’t eager to return to his old business of filmmaker-for-hire, and wasn’t sure he could anyway; once you fall out of people’s Rolodexes in Hollywood it’s damnably hard to get yourself back in. But, based on the evidence so far, this computer-game thing wasn’t exactly proving to be a financial winner either. The name of his game was now seeming sadly apropos. Making The Fool’s Errand, it seemed, had itself been a fool’s errand.

Cliff Johnson, 1987

The game’s life preserver, and thus Cliff’s as well, came in the form of a superlative feature review (“5 Mice!”) written by the prominent Macintosh pundit and columnist Neil Shapiro for the July 1988 issue of MacUser. Shapiro was the first reviewer to take the time to properly dig into the game, to understand what it was and what it was doing. He liked what he saw. In fact, he really liked what he saw. “Cliff Johnson has taken computer gaming, turned it inside-out and upside-down, and redefined the state of the art,” he wrote. He continued to champion the game relentlessly over the months that followed. The floodgates opened; The Fool’s Errand became a hit. A suddenly much more enthusiastic Miles Computing belatedly funded ports to the Commodore Amiga, the Atari ST, and MS-DOS. Cliff, who had nothing to do with programming those versions, was never happy with the way they looked or played. He considers them “Bizarro World” versions of his game, ugly, simplified, and buggy to boot. It was a long, not entirely successful struggle for him just to keep the worst of the porters’ razzle-dazzle videogame flourishes out of the end results. Still, in combination they sold far more copies than the Macintosh original. Total sales of The Fool’s Errand reached 100,000 copies by the end of 1989, perhaps not quite a blockbuster by the standards of the big boys but by far the biggest hit that little Miles Computing had ever enjoyed. Certainly it was more than enough to let Cliff pay off his credit cards and remain a game developer. We’ll be continuing to follow him in his new career in future articles.

For now, though, let’s talk about The Fool’s Errand itself just a little bit more. It’s one of those singular works that defies (transcends?) the conventional wisdom — including plenty of the wisdom that I spout routinely right here on this blog. Having chided people from Scott Adams to Ken Williams for refusing to engage with the games made by others outside their companies, I must admit that Cliff Johnson didn’t know a thing about other computer games at the time he wrote The Fool’s Errand, and never bothered to learn — and yet his game turned out brilliantly. Having harped endlessly on the importance of testing and player feedback, I must admit that The Fool’s Errand was seriously played by just three people not named Cliff Johnson prior to its release — and yet, again, his game turned out superbly, and about as bug-free as a game can be to boot. What is there to say, other than don’t try this at home, kids?

In his MacUser review, Neil Shapiro rather brilliantly described The Fool’s Errand as a “whole buffalo” game. Everything you see on the screen is important; nothing is extraneous or inexplicable. When you first start the game, it’s both underwhelming and a little baffling, nothing more than a handful of puzzles — most of them fairly trivial — and a few scattered fragments of a story that doesn’t make much sense. And so you shrug your shoulders and have a go at one of the puzzles, maybe starting with something simple like the word search for names of countries. Slowly you begin to peel back layer after layer of the onion. Are certain words in the story printed in boldface? It’s not just for aesthetic effect; there’s a reason for it that will become clear in time. Have no idea what to do with this scrambled map? Work on other, simpler problems for a while and insight might just come. Finished all of the puzzles from the initial menus and think you’re about to win? You’re actually only about halfway there, with the tastiest parts of the onion still waiting for you. I can’t emphasize enough what a wonderfully intriguing experience solving The Fool’s Errand becomes. My wife Dorte and I played it together, as we do many of the games I write about here, and enjoyed one of the most captivating gaming experiences we’ve ever shared. (I suspect that Dorte, a puzzle addict who’s much better at most kinds of them than I am, would delete the “one of” from that sentence.)

Chatting with me about The Fool’s Errand, Cliff was at pains to emphasize how incredible it is to him that people today, almost thirty years later, continue to enjoy his first game. Like most designers at the time, he wasn’t thinking beyond the next year or so, and certainly gave no thought whatsoever to The Fool’s Errand as a work for posterity. Yet it feels every bit as contemporary and compelling today as it must have then, the very definition of a timeless work. I think we can ascribe that timelessness to a number of things. Far more than just a collection of puzzles, there’s a beauty about this design, its many strands looping over and entwining one another like a Bach fugue: the text with its simple diction of Myth; the pictures, which are so lovely and evocative that black-and-white seems an aesthetic choice here, not a limitation of the hardware; the intricately fashioned meta-puzzle itself, leading to that Eureka! moment when it all comes together. Perhaps most of all, there remains a generosity of spirit about The Fool’s Errand that bleeds through the screen. As Cliff has stated many times, his goal is never to absolutely stymie you, to prove that he’s the cleverer by presenting you with impossible dilemmas. He wants to tempt and entice and, yes, to challenge you — what fun would The Fool’s Errand be otherwise? — but ultimately he wants you to succeed, to peel back the onion and to share in The Fool’s Errand‘s mysteries. There’s no nonsense in the game; it always plays fair. Take your time with it, and it will reward you in spades.

So, I think you should play this game if you haven’t already. If you enjoy the sorts of games I usually feature on this blog, I think this one will blow you away. The state of classic Macintosh emulation in general being a disgraceful mess for such an historically important platform, I want to do all I can to make that as easy as possible for you. I’ve therefore made a zip that contains the most user-friendly of the early Macintosh emulators, Mini vMac, in versions for Windows, (modern) Macintosh, and Linux. The zip also includes the ROM file that Mini vMac needs to run (please, nobody tell Apple!), the disk images for the game along with a formatted save disk, the original instruction manual, and some brief instructions I’ve written to get you going with the whole package. Give it a shot. If I’ve done my part properly, it won’t be hard at all, and I think you’ll be glad you did. This one is touched, folks.

(Sources: This article is mostly drawn from a long interview I conducted with Cliff himself. Much other information about his life, career, and games can be found on his personal website, although, in keeping with The Fool’s Errand itself, you sometimes have to dig a bit in unexpected places for it.

If you play and enjoy The Fool’s Errand, be sure to check out The Fool and His Money, the long-awaited 2012 sequel that Cliff describes as “everything The Fool’s Errand is times ten.” Dorte and I haven’t had the pleasure yet ourselves, but, believe me, we will just as soon as I can break free of all my moldy oldies for long enough.)

Footnotes
1 For better and often for worse, Masquerade‘s connection to computer gaming extends far beyond the story of Cliff Johnson. The man who first “solved” the riddle and was awarded the hare in March of 1982, one Dugald Thompson, did so, it was revealed years later, largely by cheating. A friend of his had as his current girlfriend a former girlfriend of Kit Williams. While Kit had never outright told her where the treasure — a golden hare sculpted by Kit himself — lay, he had inadvertently dropped enough clues that she could point Thompson in the right direction.

After he was awarded the prize, Thompson formed a company called Haresoft to release an unspeakably horrible computer game called Hareraiser that supposedly contained further clues pointing to a new location of the golden hare. If it did, nobody ever worked them out. More likely, the game was yet another instance of fraud committed by Thompson, designed to make a quick buck from the players that it sent off on a wild goose chase after its nonexistent clues. It justly bombed, Haresoft went into liquidation, and Thompson was forced by his creditors to sell the golden hare at auction.

Long before those events, Masquerade had inspired other British game developers to embed real-world treasure hunts and other puzzles in their own games, perhaps most notably the hunt for the “Golden Sundial of Pi” buried in the Sussex Downs by Mel Croucher and Christian Penfold of Automata. All told, that golden hare had one hell of a long reach.

 

Macware

In the Macintosh software artists confronted that rarest of things, a completely new canvas. It wasn’t just a case of the Mac being better than the PCs that had come before; they’d had plenty of experience already dealing with that. No, the Mac was not so much better as fundamentally different. For all the possibilities opened up by the Mac’s mouse, its toolbox of GUI widgets accessible by any program, its crisp high-resolution screen, and its ability to make practical use of sampled sound recorded from the real world, there were also lots of caveats and restrictions. The black-and-white display and the lack of handy joysticks, not to mention the lack of obvious ways to get out of the windows-and-mouse paradigm, meant that many or most existing games would make little sense on the Mac. All Mac software, games included, would have to strike off in entirely new directions rather than building on the stuff that was already out there. That, of course, was very much how Steve Jobs and company had intended things to be on their paradigm-shifting creation. The original Mac team has mentioned almost to a person how excited they were at the launch to see what people would make with Macintosh, what they could do with this new set of tools. Game programmers were as eager as anyone to take up the challenge.

And some of them were to be found right there at Apple. Indeed, the Mac’s first great game far predates the launch. Like so much else on the Mac, it was born on the Lisa.

Through the Looking Glass, née Alice

At some point in the middle stages of the Lisa’s long gestation, a programmer specializing in printer interfacing named Steve Capps started tinkering in his spare time with Alice, a game inspired by the chess motif running through Lewis Carroll’s Through the Looking Glass. The player moved a piece representing Alice in real time around a chess board which was laid out in a striking 3D perspective, trying to stomp on all of the opposing pieces before they stomped on her. It was a simple concept, but one admitting of surprising depth, since the player was subject to the normal movement rules of whichever chess piece she chose to play as at the outset. None other than the Lisa team’s head of systems programming, Bruce Daniels, introduced the Mac people to Alice. With the affable Daniels acting as intermediary, Capps soon received a Mac prototype along with the Mac team’s heartfelt request that he port Alice to it, a request to which he quickly acceded. The game was a better fit for the Lisa’s more playful younger brother anyway, and, thanks to the Mac’s 3 extra MHz of clock speed, even ran more smoothly.
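That one rule is the whole trick, and it’s easy to see in miniature. A hedged sketch — my own invention for illustration, not Capps’s Lisa code — of a legal-move check for whichever piece the player has chosen to be:

/* Hypothetical sketch (not Capps's code) of the rule that gives Alice
   its depth: Alice may only move like the piece she plays as.
   Pawns are omitted for brevity. */
#include <stdbool.h>
#include <stdlib.h>

typedef enum { KNIGHT, BISHOP, ROOK, QUEEN, KING } Piece;

/* dx, dy: the proposed displacement in board squares. */
bool legal_move(Piece p, int dx, int dy)
{
    int ax = abs(dx), ay = abs(dy);
    if (ax == 0 && ay == 0) return false;   /* must actually move */
    switch (p) {
    case KNIGHT: return (ax == 1 && ay == 2) || (ax == 2 && ay == 1);
    case BISHOP: return ax == ay;            /* diagonals only */
    case ROOK:   return ax == 0 || ay == 0;  /* ranks and files */
    case QUEEN:  return ax == ay || ax == 0 || ay == 0;
    case KING:   return ax <= 1 && ay <= 1;  /* one square any way */
    }
    return false;
}

Play as a knight and the board becomes a maze of forced angles; play as a queen and Joanna Hoffman’s complaint about the game being too easy starts to make sense.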

Alice became an obsession of the Mac team, with marketer Joanna Hoffman a particular devotee. She complained constantly that the game was too easy, prompting the obliging Capps to tweak it to increase the challenge. As Capps himself has since acknowledged, this probably wasn’t all to the good; the game that would eventually see commercial release is extremely challenging. Other suggestions, like the one from Steve Wozniak that the mouse cursor should shrink as it moved “deeper” into the board to emphasize the 3D perspective, were perhaps more productive. Steve Jobs showed little interest in the game itself (one of the many constants running through his career is an almost complete disinterest in games), but was very intrigued by the programming talent it demonstrated. Alice became Capps’s ticket to the Mac team in January of 1983, where he did stellar work on the Finder and other critical parts of the first version of MacOS.

As the big launch approached, Capps was understandably eager to explore the commercial potential of this game that had entranced so many of his colleagues. Trip Hawkins, who had continued to stay in touch with goings-on inside Apple even after resigning from the Lisa team, was sniffing around with proposals to release Alice under the Electronic Arts aegis, with whose accessible-but-arty early lineup it would have made an excellent fit. Steve Jobs, however, had other ideas. Feeling that the game should come out under Apple’s own imprint, he delivered a classically Jobsian carrot — that Apple would do an excellent job packaging and promoting the game — and stick — that, since Alice had been created by an Apple employee on the Apple campus using prototype Apple hardware and proprietary Apple software, it was far from clear that the game belonged to said employee in the first place, and legal trouble might just be the result if Capps decided to assume it did. And so Capps agreed to allow his game to become the first and only game that Apple themselves would ever release for the Mac.

The Through the Looking Glass package

The discovery of a database application already trading under the name of “Alice” necessitated a name change to the less satisfactory Through the Looking Glass. But otherwise Apple’s packaging of the game, made to look like an original edition of the novel that had inspired it — albeit one sporting a hidden Dead Kennedys logo, a tribute to Capps’s favorite band — was beautiful and perfect. EA couldn’t have done any better.

The marketing, though, was another story. Through the Looking Glass became a victim of Apple’s determination in the wake of the Lisa’s failure to reposition the Mac as their serious business computer, to shove the fun aspects of the machine under the carpet as something shameful and dangerous. Thus Capps’s game got nary a mention in Apple’s voluminous advertising that first year, and mostly languished as a dusty curiosity on dealers’ shelves. The game has gone on to become something of a cult classic as well as a treasured piece of Macintosh lore, but Trip Hawkins would doubtless have done a much better job of actually selling the thing.

Others also had good reason to be frustrated with Apple’s fear of fun. Infocom received a visit from Guy Kawasaki, today the most famous of all Apple’s early “Mac evangelists,” well before the Mac’s launch. In the words of Dan Horn, head of Infocom’s Micro Group, Kawasaki “begged” Infocom to get their games onto the Mac, and delivered several prototypes to make it happen. It turned out to be unexpectedly challenging. The pre-release version of MacOS that Infocom received with the prototypes was so buggy that they finally decided to throw it out altogether. They wrote their own simple window and menu manager instead, packaging it onto self-booting disks that dumped the player straight into the game. When the Mac debuted, Infocom’s catalog of ten games represented something like 50% of the machine’s extant software base. But by now the winds of change had blown at Apple, and Infocom couldn’t get Kawasaki or anyone else to even return their phone calls. No matter; Mac early adopters were a more accepting lot than much of Apple’s executive wing. Infocom did quite well on the Macintosh, especially in those earliest days when, Through the Looking Glass and a bare few others excepted, their games were the only ones in town.

Ultima III on the Mac

Still, Infocom was hardly the only gaming veteran to test the Macintosh waters. Sierra and Origin Systems demonstrated how pointless it could be to try to force old paradigms into the new one via their ports of, respectively, Ultima II and Ultima III to the Mac. The latter is a particular lowlight, with Ultima’s traditional alphabet soup of single-letter commands just jumbled into a couple of long menus helpfully labeled “A-M” and “N-Z.” Thankfully, most other developers either did original work or took a lot more care to make their ports feel like native-born citizens of the Macintosh.

Sargon III on the Mac

Dan and Kathleen Spracklen, creators of the long-lived Sargon line of chess programs, ported the latest iteration, Sargon III, to the Mac complete with a new mouse-based interface and absolutely loads of learning aids and convenience features hanging from its menus. None other than Bill Atkinson, architect of QuickDraw and MacPaint, paused to note how the Mac version of Sargon III changed his very concept of what a chess program was, from an opponent to be cowed to something more positive and friendly, like the Mac itself.

I have to set Sargon III on the easy level. The challenge used to be seeing if the computer could beat you. The challenge now is for the computer to teach you, by leading you, giving you hints, letting you take back moves.

Bill Budge ported Pinball Construction Set, the program whose GUI interface presaged a largely Mac-inspired revolution in games when it appeared on the Apple II, to the Mac itself. As he himself noted, however, what was revolutionary on the Apple II was “just another program” on the Mac. Still, the Mac Pinball Construction Set did let you load your MacPaint pictures in as fodder for your custom pinball tables, a demonstration of one of the less immediately obvious parts of the new Mac Way: its emphasis on crafting applications that cooperate and complement rather than compete with one another.

Bill Atkinson’s MacPaint

Bill Budge’s MousePaint

Budge also went the other way, creating what amounted to an Apple II port of MacPaint called MousePaint that copied the original right down to the little Apple logo in the upper left of the menu bar. Packaged with Apple’s first mouse for the II line, MousePaint is one of the more obvious examples of the impact the Mac was already having on more modest platforms. (Budge also claimed to be working on a space simulation, but, like his vaunted Construction Set Construction Set and so much else during his years in the wilderness, it would never see the light of day.)

Much other early Mac entertainment also evinced the Pinball Construction Set approach of giving you ways to make your own fun, an ethos very much in keeping with that of the machine itself. MasterPieces, for instance, let you carve your MacPaint drawings up into jigsaw puzzles, while MacMatch let you use them to create matched-pair puzzles like the old game show Concentration. Still other programs weren’t technically games at all, but no less entertaining for it: things like Animation Toolkit; MusicWorks, which made the first spectacular use of the Mac’s four-voice sound capabilities; HumanForms, which let you make people, Mr. Potato Head-style, out of assorted body parts. Defender clones may have been in short supply on the Mac, but this heady, intellectual stripe of playfulness was everywhere by the time the machine entered its troubled second year. Thus Balance of Power felt like a perfect fit when it arrived that summer.

A magazine-published screenshot of the lost original Balance of Power

A creation of programmer, designer, writer, theorist, and industry gadfly Chris Crawford, Balance of Power is an ambitious geopolitical simulation of the contemporary world circa 1985. It places you in charge of either the United States or the Soviet Union, seeking to extend your sphere of influence over as many as possible of the sixty other countries in a high-stakes game of Cold War brinksmanship. It’s a grandiose concept indeed, and becomes even more so when you consider the sheer amount of information Crawford has packed in — stuff such as number of physicians per million people, average daily caloric intake, and average school enrollment for each country. Not only would earlier machines have drowned under such a tsunami of data, but making it accessible and relatable would also have been nearly impossible. In Balance of Power, it’s all organized into neat menus and windows, as fine an example of the Mac’s ability to make information visually immediate and relevant as anything that came out those first couple of years. Before too long all grand strategy games would be like this.

Significant as it is as a waystation on the road to Civilization, Balance of Power is also a huge landmark of the serious-games movement. Simply put, this game has a rhetorical agenda. Boy, does it have an agenda. Pushing your opponent too far results in nuclear war, and the most famous piece of text Crawford has ever written.

You have ignited a nuclear war. And no, there is no animated display of a mushroom cloud with parts of bodies flying through the air. We do not reward failure.

It’s as powerful a statement now as then on not only the foolishness of jingoist brinksmanship but also on the seemingly perpetual adolescence of much of the mainstream games industry. Yet, and speaking here as someone who is quite sympathetic to Crawford’s agenda on both counts, it’s also kind of disingenuous and unfair and, well, just kind of cheap.

The problem here is that the game simply assumes bad faith on my part, that I’ve touched off a nuclear war so I can see body parts and mushroom clouds. In actuality, however, the body-parts-and-mushroom-clouds crowd is highly unlikely to have ever gotten this far with the cerebral exercise that is Balance of Power. It’s more likely that I’ve tried to play the game within the rules Crawford has given me and simply failed, simply pushed a bit too hard. It’s important to note here that playing within Crawford’s rules requires that I engage in brinksmanship; I can win only by pushing my luck, aggressively trying to spread my political agenda through as much of the world as possible at my fellow superpower’s expense so that I can end up with more “prestige points” than them. There is neither a reward nor any real mechanism for engendering détente and with it a safer world. Given that vacuum, I don’t really like being scolded for playing the game the only way that gives me any hope of success on the game’s own terms. To compound the problem, it’s often all but impossible to figure out how close your opponent actually is to the proverbial big red button, hard to know whether, say, Indonesia is really considered worth going to war over or not. Nuclear war, when it comes, can seem almost random, arising from a seemingly innocuous exchange after half a dozen computerized Cuban Missile Crises have passed harmlessly. There may arguably be a certain amount of rhetorical truth to that, but it hardly makes for a satisfying game. Perhaps more attention paid to presenting a real picture of the state of mind of your opponent and less to that mountain of 95% useless statistics could have helped — an ironic complaint to make about a game by Chris Crawford, coiner of the term “process intensity” and perpetual complainer about the prevalence of static data as opposed to interactive code in modern games.

I don’t want to belabor this too much more lest our real purpose here get entirely derailed, but will just note that Balance of Power falls into the trap of too many serious games to come as well as too many of Crawford’s own games in simply being not much fun to play. Crawford would doubtless simultaneously agree with and dismiss my complaints as a product of a body-parts-and-mushroom-clouds sensibility while noting that he aspires to something higher than mere fun. Which is fair enough, but I tend to feel that for a game to achieve any other rhetorical goal it must be engrossing in a way that Balance of Power just isn’t. Anyway, everything of ultimate note that it has to tell us about geopolitics is contained in the quote above. If, as in the movie WarGames, the only way to win is not to play, why charge people $50 for the non-experience? Suffice to say that, like plenty of other works I’ve written about on this blog, Balance of Power garners historical importance and even a certain nobility simply for existing when it did and trying the things it did.

I want to end this little survey today with a less rarefied game that’s of at least equal historical importance. It’s the product of a small Chicago-area company called ICOM Simulations, which had already been kicking around the industry for a few years under the name of TMQ Software. Tod Zipnick had formed TMQ in 1981; its most ambitious pre-Mac product had been File-Fax, a database manager for the Apple II that garnered a positive review or two but few sales. Other than that, they’d mostly specialized in doing action-game ports to various platforms for the likes of Atarisoft, Coleco, and even EA. When the Mac arrived, they figured their odds of making a splash with original games in that new ecosystem were far better than they were on the congested likes of the Apple II.

Déjà Vu. Note the multiple layers of containment.

ICOM’s big idea was to translate the traditional parser-driven adventure game into the new visual paradigm of the Mac. The goal was essentially to do for the adventure what MacOS had done for the command-line-driven operating systems that preceded it, in pretty much exactly the same ways. The underlying world model of the MacVenture engine is that of a text adventure, divided into discrete interconnected rooms which can contain other objects with their own unique properties, including the one representing you, the player. In a MacVenture, however, you interact with objects not by typing sentences but by constructing them visually, clicking one of eight verbs and an object to go with it — whether something in the room that you see represented graphically before you, something in your inventory (also represented as a set of draggable pictographs), an exit, or just your “Self.” You can add an indirect object by “OPERATING” one object (first click) on another (second click). You can pick up an object in the room just by dragging it to your inventory; drop it by dragging it back into the room. Objects can and often do contain other objects: you can “OPEN” the trench coat in your inventory to open a window showing you what’s in its pockets, “OPEN” the wallet you find there to open still another window with its contents, and so on down the hierarchy tree.
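For the technically curious, here is a minimal sketch in Python of the kind of containment-based world model described above. It is purely illustrative: every name in it is invented, and ICOM’s actual engine, a product of 1980s systems programming, looked nothing like this. The sketch aims only to capture the essential idea of a tree of openable objects driven by a small, fixed set of verbs.

class Thing:
    # A node in the world's containment tree: rooms, props, and the
    # player alike are all just objects that can hold other objects.
    def __init__(self, name, openable=False):
        self.name = name
        self.openable = openable
        self.contents = []

    def put(self, item):
        self.contents.append(item)
        return self

def do_open(thing):
    # "OPEN" in a MacVenture pops up a window showing an object's
    # contents; here we settle for returning a line of text.
    if not thing.openable:
        return "The " + thing.name + " won't open."
    inside = ", ".join(item.name for item in thing.contents) or "nothing"
    return "[window] " + thing.name + ": " + inside

# The trench-coat example from the text: coat holds wallet holds card.
card = Thing("business card")
wallet = Thing("wallet", openable=True).put(card)
coat = Thing("trench coat", openable=True).put(wallet)

print(do_open(coat))    # [window] trench coat: wallet
print(do_open(wallet))  # [window] wallet: business card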

In the fall of 1985, when the first MacVenture debuted in the form of a two-fisted private-eye caper called Déjà Vu, it was an absolute stunner, the sort of thing that could stop people in their tracks when they stumbled across it running on an in-store computer. And it’s still a fine interface, very intuitive and, a few quibbles about clutter resulting from the small screens of its era aside, very practical and enjoyable today.

It’s all too typical in the industry for a game with the shiny technical innovations of Déjà Vu to coast on them, for the actual design inside the engine to be little more than a tech demo. Nor is ICOM’s pedigree as a collection of hardcore programmers’ programmers all that comforting. I thus didn’t expect to think too much of Déjà Vu as a game when I played it for this article. I must say, though, that ICOM surprised me there.

Déjà Vu begins on December 7, 1941(!), when you wake up in a bathroom stall inside a deserted bar with no memory of who you are or how you got there or who or what you emptied three shots from your revolver into or why you seem to have a veritable cocktail of drugs flowing through your veins. Yes, amnesia is a cliché premise in adventure games, not least because it’s so damn convenient for a genre that’s really good at exploration and backstory but usually not so good at here-and-now plotting. Yet it can also be a compelling premise, in mystery fiction as well as games, and it works here. The mystery of who you are and how you got to that bathroom stall is intriguing, its unraveling compelling, with complications like the dead mobster that you soon also find in the bar (with three of your bullets in him, naturally) coming thick and fast. In contrast to so many games of its era, Déjà Vu is also pretty solvable. Oh, it’s very old school, with an unforgiving time limit — the drugs in your system will eventually kill you if you can’t find the antidote — and the occasional random death. You’ll need to save early and often and plan your forays carefully. Yet if you’re willing to do that you’ll find you can probably crack the case pretty much unassisted, and have a pretty good time doing it.

Déjà Vu doesn’t take itself all that seriously, but it doesn’t treat its whole premise as just a breeding ground for jokes either. As a relatively coherent work of fiction, it stands amongst the top tier of 1980s adventure games. The jokes that are there mostly fit the setting and are, shocker of shockers, genuinely funny as often as not. Much of the humor pokes fun at the protagonist, hardly unusual for early adventure games, but it doesn’t feel so personally insulting here because the game does a good enough job with characterization that you actually feel it to be sneering at the character you’re playing rather than you personally. About the only unfortunate aspect is an ugly series of juvenile jokes about an overweight woman, the sort of thing that can trigger a mild epiphany today about just how much certain social mores have changed — and, mind you, very much for the better — in the last thirty years.

Credit for Déjà Vu’s surprisingly satisfying design largely goes to Craig Erickson. The story behind it was written by Kurt Nelson, Mark Waterman did the visuals, and Darin Adler, Steve Hays, and Todd Squires were the technical architects of the engine itself. Like Balance of Power, Déjà Vu was published by Mindscape, a company that, like EA, dated from the big second wave of publishers and that, also like EA, was publishing some of the most interesting and audacious games in the industry during the mid-1980s. (That said, ICOM fell in with Mindscape largely out of convenience, because they were literally right down the road in an adjacent suburb of Chicago.) And also like Balance of Power, Déjà Vu was a hit by the modest standards of the early Macintosh software market, the big breakthrough that ICOM had been seeking for years. Tod Zipnick soon put his programmers to good use porting the MacVenture engine to other platforms, including not only the Mac’s mice-and-windows-and-68000-based competitors the Atari ST and Commodore Amiga but also the likes of the IBM PC, the Commodore 64, eventually even (in ports done by the Japanese company Kemco) the Nintendo Entertainment System — yet another sign of the importance of the Mac not just as a platform but as a concept and an engine of innovation.

ICOM has tended to be overlooked in histories of the graphic adventure, which mostly dwell on Sierra (whose King’s Quest debuted the year before Déjà Vu) and LucasArts (whose Maniac Mansion debuted two years after). In truth, however, the MacVenture engine is at least as important as Sierra’s AGI or LucasArts’s SCUMM engines. While King’s Quest is a deserved landmark simply for mixing interactive graphics with adventure at all, the AGI engine is also something of an evolutionary dead end with some fairly intractable problems, most notably that of trying to translate the objects you see graphically on the screen into words the parser will understand. LucasArts’s innovations, meanwhile, are more formal than technical, a declaration that it is possible to write challenging, enjoyable graphic adventures without random deaths, unforeseeable dead ends, and incomprehensible puzzles. The actual interface mechanics of the early LucasArts games are essentially a hybrid of AGI and MacVenture that is more playable than the former but not quite so slick as the latter. Déjà Vu gave its players in 1985 a preview of what virtually all commercial adventure games would be like five to seven years later. For a fan of prose and parsers like me and presumably many of you, that makes its debut something of a bittersweet moment, representing as it does one more huge nail in the slowly building coffin of the commercial text adventure. But such is progress.

Three more MacVenture games followed Déjà Vu, one of them a direct sequel. We’ll revisit ICOM at some future date to talk more about them, as we also will the ongoing cottage industry that was Mac software in general. In the meantime, you can play Déjà Vu and all of the other MacVentures online courtesy of Sean Kasun.

Their days may be numbered, but there’s still plenty to be written about the prose-and-parser people as well. We’ll take up that thread again next time, when we start to look at yet another of Infocom’s would-be challengers.

(Significant magazine sources: Electronic Games of March 1985; Byte of March 1986; Family Computing of April 1986. Jason Scott’s interviews with Steve Meretzky and Dan Horn for Get Lamp were invaluable as always; thanks, Jason! See a retrospective by Tom Chick for another take on Balance of Power. The picture that opens this article was taken from the March 1985 Electronic Games, who I wish had lasted longer; in addition to great art that I love to steal, the magazine had an unusually thoughtful editorial voice.)

 

Macintosh

The Apple Macintosh followed a long and winding road to join Steve Jobs onstage in front of a cheering throng at De Anza College’s Flint Auditorium on January 24, 1984. It was never even a particular priority of its parent company until, all other options being exhausted, it suddenly had to be. But once it finally was let out of its bag it became, just as its father predicted, the computer that changed everything.

Jobs wasn’t even the first father the Mac knew. It had originally been conceived almost five years earlier by another dreamer, digital utopianist, and early Apple employee named Jef Raskin who believed he could save the world — or at least make it a better place — if he could just build the Dynabook.

The brainchild of still another dreamer and visionary named Alan Kay, who first began to write and speak of it in the very early days of Xerox PARC, the Dynabook was more thought experiment than realistic proposal — a conception, an aspirational vision of what could one day be. Kay called it “a dynamic media for creative thought”:

Imagine having your own self-contained knowledge manipulator in a portable package the size and shape of an ordinary notebook. Suppose it had enough power to outrace your senses of sight and hearing, enough capacity to store for later retrieval thousands of page-equivalents of reference materials, poems, letters, recipes, records, drawings, animations, musical scores, waveforms, dynamic simulations, and anything else you would like to remember and change.

The Dynabook was a tall order in light of the realities of 1970s computer technology. Indeed, nothing that came remotely close would actually appear for another two decades at least. As Kay himself once put it, thinkers generally fall into two categories: the da Vincis who sketch away like mad and spin out a dozen impractical ideas before breakfast upon which later generations can build careers and obsessions; and the Michelangelos who tackle huge but ultimately practical projects and get them done. Kay was a da Vinci to the bone. The PARC researchers built a less fanciful workstation, the Alto, to serve as their primary engine of innovation for the time being, dubbing it the “interim Dynabook.”

Michael Scott, Steve Jobs, Jef Raskin, Chris Espinosa, and Steve Wozniak circa 1977

Much later in the decade, Raskin thought he might advance the cause a bit more with an interim Dynabook of his own. He thought even the much-loved Apple II was too complicated, too difficult and fiddly, too aesthetically unpleasant, too big to ever play an important role in anyone’s life who was more interested in what she could do with a computer than the computer as an end in itself. He therefore pitched to the executives at Apple his idea for a relatively cheap (about $1000) and portable computer that, far from being the hardware hacker’s playground that was the Apple II, would be a sealed, finished piece — the only one you had to buy to start expressing yourself digitally. Even all the software you’d need would come built right in. Believing that the standard industry practice of naming prototypes after women (as often as not the prettiest secretary in the office) was sexist, he decided to call his idea Macintosh, after his favorite type of (edible) apples, the McIntosh.

In many ways Raskin’s idea cut directly against the grain of Apple’s corporate strategy, which was to further penetrate the business market, in the short term via the Apple III and in the long via the Lisa; both projects were already underway, although the latter was in nothing like the form it would eventually assume. While Apple was trying to trade in their bellbottoms for three-piece suits, Raskin was still living the hippie dream of bringing power to the people. “If I wanted to work for a business company, I’d join IBM,” he told Apple’s president Mike Scott. Still, the company was booming and an IPO was already visible on the horizon. There was enough money and enough hippie utopianism still hanging about the place to let Raskin and a few others tinker with his project.

The Macintosh project during its first eighteen months rarely had a staff of more than four, and often fewer than that; Raskin had to fight for scraps. Sometimes that worked out just as well; a key acquisition was Burrell Smith, a talented hardware engineer whom he rescued from a job as a lowly service technician, testing and repairing Apple IIs that had come back to the company under warranty. Smith became the Mac’s hardware guru, a position he would continue to hold right up through the launch and some time beyond, giving him by far the longest tenure of any member of the original team. Given Raskin’s price window, Smith couldn’t afford to design anything that would be much more powerful than the Apple II; the first prototype was built around an 8-bit Motorola 6809 no more powerful than the Apple II’s 6502, and had just 64 K of memory. It did, however, use a relatively high-resolution bitmapped display in lieu of the Apple II’s text. Although Raskin was oddly unenamored of mice and windows, this part at least of the Xerox PARC gospel had reached him loud and clear.

With Raskin himself often not seeming sure what he wanted and what was doable and many of his staff not seeming overly interested in buckling down to work on his schemes, the project languished through most of 1980. On one or two occasions it was actually cancelled, only to be revived in response to Raskin’s impassioned pleas. Yet practical progress was hard to see. Raskin mostly busied himself with The Book of Macintosh, a sort of aspirational bible hardly more practical than Kay’s original dream of the Dynabook. Then Steve Jobs read The Book of Macintosh and promptly came in and took his computer away from him.

Jobs was a huge headache for Michael Scott, Mike Markkula, and the rest of Apple’s senior leadership, who received memos almost daily complaining about his temper, his dismissive attitude toward the Apple II platform that was the only thing supporting the company, and his refusal to listen to reason when one of his sacred precepts was threatened. Jobs’s headstrong authoritarianism had been a big contributor to the debacle that was the Apple III launch. (Traditional wisdom, as well as an earlier version of this article, would have it that Jobs’s insistence that the Apple III ship without a cooling fan led directly to the hardware problems that left Apple IIIs dying on buyers’ desks by the thousands. It does, however, appear that this version of events is at least questionable; see the comments section for more about that. Be that as it may, everyone involved would agree that Jobs did an already muddled project no favors.) The Apple III never recovered, and would pass into history as Apple’s first flop. Now he was sowing the same chaos within the Lisa project, a computer the company simply couldn’t afford to let go the same way as the Apple III. Scott and Markkula forcibly removed him from Lisa in late 1980. They would have liked for him to just content himself with enjoying his post-IPO millions and accepting the occasional medal at the White House as a symbol of the American entrepreneurial spirit while they got on with actually running his company for him. They would have liked, in other words, for Jobs to be like Wozniak, who dipped in and out of the occasional engineering project but mostly was happy to spend his time organizing rock festivals and finishing his education and learning to fly an airplane and generally having all the good times he’d missed during a youth spent with his head buried in circuit boards. Jobs, alas, was not so pliable. He wanted an active role at what was after all still in some moral sense his company. Trouble was, every time he took an active role in anything at all anger and failure followed. Thus his forcible eviction from Lisa while it still looked salvageable. But at the same time Apple certainly couldn’t afford an ugly break with their founder and entrepreneurial golden boy. When a hurt Jobs started to lick his wounds from Lisa not through ugly public recriminations but by interesting himself in Raskin’s strictly small-time Macintosh project, the executives therefore took it as very good news. Let him tinker and meddle to his heart’s content with that little vanity project.

But Jobs’s interest was very bad news for one Jef Raskin. Never really technical himself, Jobs nevertheless knew very well how technical people thought. He innocently suggested to Burrell Smith that he might dump the plebeian old Motorola 6809 in favor of the sexy new 68000 that the Lisa people were using, and double the Mac’s memory to 128 K while he was at it. That was an offer no hardware hacker could resist. With Smith successfully subverted, it was just a matter of time. Raskin wrote furious memos to upper management about Jobs’s unauthorized takeover of his project, but they fell on predictably deaf ears. Instead, in early 1981 the takeover was made official. Jobs condescendingly offered Raskin the opportunity to stay with the Macintosh in the role of technical writer. Raskin, who by all indications had an ego almost as big as Jobs’s own, refused indignantly and walked out. He never forgave Jobs for co-opting his vision and stealing his project, remaining convinced until his death in 2005 that his Macintosh would have been better for Apple and better for the world than Jobs’s.

For all that the project had been in existence for over eighteen months already, there was very little really to Macintosh at the time of the takeover — just Raskin’s voluminous writings and some crude hardware based on an obsolete chip that resoundingly failed to live up to the visions expressed in The Book of Macintosh. Thus one could say that the real story of the Macintosh, the story of the machine that Jobs would finally unveil in January of 1984, begins here. Which is not to say that Jobs discarded Raskin’s vision entirely; he had after all been originally drawn to the project by the ideas inside The Book of Macintosh. Although the $1000 goal would be quietly dropped in fairly short order, the new machine should nevertheless be inexpensive at least in comparison to the Lisa, should stress elegance and simplicity and the needs of everyday non-computer people above all else. Jobs, however, shared none of Raskin’s skepticism about mice and menus. He had bought the GUI religion hook, line, and sinker, and intended the graphical user interface to be every bit as integral to the Macintosh as it was to the Lisa. Hell, if he could find a way to make it more so he’d do that too.

Much of the original Mac team: Bill Atkinson, Andy Hertzfeld, Chris Espinosa, George Crow, Joanna Hoffman, Burrell Smith, and Jerry Manock. Taking a leaf from Electronic Arts’s playbook, Apple photographed them often in artful poses like this one during the Mac’s initial promotional push.

Still with pull within Apple the likes of which Raskin could only dream of, Jobs began assembling a group of stars to start over and make his version of Macintosh. Joining Smith the hardware guru were additional hardware engineer George Crow; programmers Andy Hertzfeld, Larry Kenyon, Chris Espinosa, Bruce Horn, Steve Capps, Bud Tribble, and Bill Atkinson; industrial designer Jerry Manock to shape the external look and feel of the machine; Susan Kare to shape the internal look and feel as designer of graphics, icons, and fonts; and Joanna Hoffman as writer, marketer, and the team’s face to the outside world, the first “Mac evangelist.” Jobs even briefly recruited Wozniak, but the latter found it hard to stay focused on the Mac, as he would just about every other project after his Apple II masterpieces, and soon wandered off again. Others would come and go, but the names listed above were the core of the team that would, just as Jobs so often promised them was inevitable, change the world.

Jobs deliberately fostered an “us against the world” mentality, with the world in this case apparently including the rest of Apple — particularly the much larger and more bureaucratic Lisa team. His dictum that “It’s better to be pirates than to join the Navy” shaped the Mac team’s conception of itself as a brilliant little band of rebels out to make a better world for everyone. They even took to flying a skull-and-crossbones flag outside their offices on the Apple campus. They were united by a sincere belief that the work they were doing mattered. “We all felt as though we had missed the civil-rights movement,” said one later. “We had missed Vietnam. What we had was Macintosh.” Their pranks and adventures have become computer-industry folklore (literally; Andy Hertzfeld’s longstanding website Folklore.org is full of them, and makes great reading).

Of course, one person’s genius at work is another’s self-entitled jerk. A joke was soon making the rounds at Apple:

How many Macintosh Division employees do you need to change a light bulb?

One. He holds the bulb up and lets the universe revolve around him.

Perhaps the people with the most justification for feeling aggrieved were those poor plodding pedants — in Jobs’s view, anyway — of the Lisa team. As Steve Capps would later put it, “A lot of people think we ripped off Xerox. But really we ripped off Lisa.”

To say that the Mac could not have existed without Lisa is in no way an overstatement. Mac was quite literally built on Lisa; for a long time the only way to program it was via one of the prototype Lisas installed in the team’s office. The Mac people watched everything the Lisa people did carefully, then reaped the fruit of whatever labor seemed useful to them. They happily digested the conclusions of the Lisa team’s exhaustive user testing of various designs and interfaces and built them into the Mac. They took Bill Atkinson’s QuickDraw, the core rendering layer at the base of the Lisa’s bitmapped display, for the Mac. Later, Jobs managed to take its programmer as well; in addition to QuickDraw, Atkinson became the author of the MacPaint application. Yes, Jobs proved surprisingly willing to borrow from the work of a team he dismissed as unimaginative plodders. The brilliance of the people involved is one answer to the question of how Macintosh was created by so few. Lisa, however, is another.

The Mac people regarded their leader with a combination of awe and bemused tolerance. It was team member Bud Tribble who coined perhaps the most famous of all descriptions for Jobs’s unique charisma, that of the “reality distortion field.” “In his presence,” noted Tribble, “reality is malleable. He can convince anyone of practically anything.” Tribble elaborated further on Jobs’s unique style:

Just because he tells you that something is awful or great, it doesn’t necessarily mean he’ll feel that way tomorrow. You have to low-pass filter his input. And then, he’s really funny about ideas. If you tell him a new idea, he’ll usually tell you that he thinks it’s stupid. But then, if he actually likes it, exactly one week later he’ll come back to you and propose your idea to you, as if he thought of it.

The aforementioned reality distortion field kept this sort of behavior from seeming as obnoxious as it would have from just about anyone else. Anyway, everyone was well aware that it was only because of Jobs’s patronage that the Mac project was tolerated at all at Apple. This little group of pirates, convinced that what they were doing was indeed (to choose another of Jobs’s catchphrases) “insanely great,” something that would change the world, knew that they owed the vision and the opportunity for Macintosh to Jobs. Atkinson later noted that “You only get one chance to change the world. Nothing else matters as much — you’ll have another chance to have vacations, have kids.” Most people, of course, don’t ever even get one chance. He and the rest of them owed theirs to Jobs.

Thankful as they were, they were hardly mindless disciples. They did their best to redirect his course when he got details as wrong as he got the big-picture vision right. When their reasoning failed, as it usually did with the imperious Jobs, they did their best to subvert him and/or to minimize the damage.

The list of bad decisions Jobs made about Macintosh is long, easily long enough to torpedo virtually any other computer. He insisted that the Mac use the same horrifically unreliable in-house-designed “Twiggy” disk drives as the Lisa, an example of borrowing a bit too much from Mac’s older sister. He categorically rejected pleas that the Mac at least have the option of memory expansion beyond 128 K, insisting that doing so would just encourage programming inefficiency and turn the Macintosh into a bloated monster like Lisa; his team’s arguments that a bitmapped, GUI-driven operating system running on a 16-bit processor required by its very nature vastly more memory than something like the Apple II got them nowhere. He rejected an internal hard drive because it would require that most hated of all pieces of technology, a noisy fan. He rejected a second internal floppy drive because there wouldn’t be room in Jerry Manock’s sleekly elegant case, plus bloat and all that. He tried to kill the Apple LaserWriter, a product that would prove almost as significant for the company as the Mac itself and without which the Mac may very well have not survived beyond its first couple of years. He cut short all discussion of networking by pulling out a floppy disk and pronouncing, “Here’s your network!” (The laser printer and Ethernet, those two other parts of the PARC gospel, had most resoundingly not reached Jobs during his famous visit.) He even refused to permit cursor keys on the keyboard, saying that the mouse was the only proper way to move the cursor in this new paradigm of computing.

The original Mac keyboard, complete with no cursor keys

People did what they could in the face of this litany. Burrell Smith made sure the Mac was capable of accommodating 3.5-inch floppy drives, the emerging industry standard soon to replace the older 5.25-inch floppies, as well as the Twiggy. When Lisa debuted a year ahead of the Mac and the Twiggy drives proved a disaster, the Mac manufacturing team was able to easily slot the 3.5-inch drives into place. (Taking the fall for Twiggy was another great service Lisa did Macintosh.) Everyone also made sure that the Mac was ready to accommodate more memory on both the hardware and software side, for when the realization finally dawned that 128 K just wasn’t going to cut it. (That realization began to dawn quite early even for Jobs; the machine he unveiled to press and public on January 24, 1984, had in fact been hacked to have 512 K. Otherwise the presentation would have been a less impressive one altogether, with a lot more time spent waiting for the Mac to deign to do something and none of the cool synthesized speech.) For most of the rest, there wasn’t much for it but to hope the machine did well enough with the early adopters that they could go back and fix the problems later. Cooler heads in management did at least prevail to save the LaserWriter.

On the hardware side, the Macintosh was smartly but minimalistically designed by Burrell Smith, a huge admirer of Steve Wozniak who strained to craft the same sort of elegant circuitry for the Mac that Woz had for the Apple II. For all that it was clean and compact, however, the Mac wasn’t terribly interesting or impressive as a piece of hardware. Jobs, from a contemporary interview in Byte magazine:

By paying a little more for the microprocessor, not only were we able to give the customer an infinitely more powerful chip than, say, an 8-bit chip or one of Intel’s baby micros, but we were able to pick up this amazing software [referring here to Bill Atkinson’s QuickDraw layer sourced from the Lisa project], and that allowed us to throw tons of chips out of this thing. We didn’t have to get special custom text or graphics chips. We just simplified the system down to where it’s just a bitmap on the screen, just Bill’s amazing software and Burrell’s amazing hardware, then in between that the other amazing software that we have. We tried to do this in every single way, with the disk and with the I/O…

The Macintosh, in other words, asks a heck of a lot of its 68000 CPU, something it could get away with because, well, it was a 68000, the most powerful reasonably priced chip in the industry of the time. A person reading that Byte interview might have asked what the 68000 could do with a bit more support in the hardware around it. That question would be answered in fairly resounding form by later 68000-based machines, most notably the Amiga, which could run rings around the Mac.

But of course that line of argument is a touch unfair, given that the Mac was the first PC in the world to be principally defined not by its hardware but by its software. The newly minted MacOS was a brilliant creation, one that went in many ways far beyond what its legendary predecessors at Xerox PARC had managed. Incredible as the Xerox Alto was, much of what we have come to expect in our GUIs as a matter of course dates not from the Xerox of the 1970s but from the Apple of the early 1980s. Amongst these are such basic building blocks as pull-down menus and even the idea of windows as draggable entities that can overlap and be stacked atop one another; on the Alto they were non-overlapping tiles fixed in place (as they also were, incidentally, in the earliest versions of Microsoft Windows). One of Jobs’s favorite aphorisms during the final frantic year of Mac development was “Real Artists Ship!” This was something the tinkerers and theorists at PARC never quite managed to do. As anyone who’s ever finished a big creative project knows, the work of polishing and perfecting usually absorbs far more time and effort — and tedious, difficult effort at that — than hammering out the rough concept ever does. Apple did this heavy lifting, thus enshrining Xerox PARC as well as the Mac itself forever in computing legend. And they did it well — phenomenally well. I have my problems with Apple then and now, but this should never be forgotten.
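To make concrete what the leap from fixed tiles to draggable, overlapping windows demands of a system, here is another small illustrative sketch, again in Python with invented names (Apple’s real Window Manager was hand-tuned 68000 code and looked nothing like this). The essential bookkeeping: windows live in a back-to-front z-order, the screen is redrawn in that order so later windows cover earlier ones, and a click pulls the struck window to the front.

class Window:
    def __init__(self, title, x, y, w, h):
        self.title = title
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, px, py):
        return (self.x <= px < self.x + self.w and
                self.y <= py < self.y + self.h)

class Desktop:
    def __init__(self):
        self.windows = []            # kept in back-to-front z-order

    def open_window(self, win):
        self.windows.append(win)     # a new window opens frontmost

    def click(self, px, py):
        # Hit-test from front to back; bring the struck window forward.
        # Fixed, non-overlapping tiles never need this reshuffling.
        for win in reversed(self.windows):
            if win.contains(px, py):
                self.windows.remove(win)
                self.windows.append(win)
                return win
        return None

    def redraw(self):
        # Painter's algorithm: draw back to front so overlaps resolve.
        for win in self.windows:
            print("drawing", win.title)

desk = Desktop()
desk.open_window(Window("MacWrite", 10, 10, 300, 200))
desk.open_window(Window("MacPaint", 60, 50, 300, 200))
desk.click(20, 20)                   # brings MacWrite back to the front
desk.redraw()                        # draws MacPaint, then MacWrite atop it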

As the Mac began to assume concrete form at the beginning of 1983, Jobs’s star at Apple was again in the ascendant. After years of muddled leadership from Michael Scott and Mike Markkula, the company had finally decided that a more dynamic leader was needed. Scott and Markkula had been Silicon Valley insiders steeped in engineering detail; Markkula had personally contributed code, testing, and documentation to the company’s early projects. To bring to fruition Jobs’s vision for Apple as a great mainstream company, known and loved by the masses, a very different sort of leader would be needed. Ideally, of course, that leader would be him, but Apple’s board wasn’t that crazy. As a second-best alternative, Jobs became enamored with a very unconventional choice indeed: a marketing expert and polished East Coast blue blood who was currently running the Pepsi brand. His name was John Sculley, and it was doubtful whether he would even know how to turn on one of Apple’s computers.

Steve Jobs and John Sculley at the Mac’s public introduction on January 24, 1984.

Even had he never hooked up with Apple, Sculley’s name would be enshrined in business lore and MBA syllabi. Not yet 45 when Jobs’s courtship began, Sculley was already a decorated general of the Cola Wars. He had been one of the pioneers of what would come to be called “lifestyle advertising.” You know the sort of thing: all those advertisements that show cool, pretty people doing interesting things whilst listening to the hippest music and, oh, yes, just happening to enjoy a Pepsi while they’re about it. (“Come alive — you’re in the Pepsi Generation!”) “Boy,” thinks the consumer, “I’d like to be like those people.” And next time she’s at the grocery store, she picks up a six-pack of Pepsi. It sounds absurd, but, as one look at your television screen will tell you, it’s very, very effective. Very few of us are immune; I must sheepishly admit that I once bought a Volkswagen thanks largely to a certain advertisement featuring a certain Nick Drake song. As Mad Men has since taught all of us and Sculley grasped decades ago, the cleverest advertising doesn’t sell us a product; it sells us possibility. The best examples of the lifestyle form, like that Volkswagen spot, can be compelling and inspired and even beautiful.

If that wasn’t enough, Sculley was later instrumental to the most legendary Cola Wars campaign of all time, the Pepsi Challenge, which cleverly combined the lifestyle approach with the more conventional hard sell. The advertisements showed that it just happened to be the cool, attractive people — many of them hip young celebrities and athletes — who preferred the taste of Pepsi to that of Coke. The ads were everywhere, an inescapable part of the cultural landscape of the late 1970s and early 1980s. And, judging by the relative sales trends of Coke and Pepsi, they were very, very effective; for the root cause of the “New Coke” fiasco of the mid-1980s, look no further.

Now Jobs wanted Sculley to do the same thing for Apple, to craft for the company an identity that transcended the specifications sheets and price comparisons that sullied their competitors. To some extent Apple already enjoyed a special status; their compelling origin story and the charisma of their two young founders along with the engaging personality of their signature creation the Apple II gave them a cachet of which drabber, more conventional companies, executives, and computers could only dream. Now Jobs believed he and Sculley together could leverage that image to make an Apple computer the hippest lifestyle accessory of the 1980s. There was more than a little bit of utopian fervor to Jobs’s vision, part and parcel of that strange intermingling of hardheaded business ambition and counterculture idealism that has always seen Jobs and the company he founded selling a better world for a rather steep price. Jobs’s deal-closing pitch to Sculley, which may never have actually passed his lips in such pithy form, has nevertheless gone down into Apple lore: “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” How could anyone refuse?

It became increasingly clear as 1983 wore on and Sculley settled into his new West Coast digs that the specific Apple computer that would be doing the world-changing must be the Macintosh. The Lisa was a flop, done in by intrinsic failings, like the unreliable Twiggy drives and its beautiful but molasses-slow GUI, and some extrinsic ones, like its high price and the uncertainty of big business — the only people who could realistically buy the thing — over what it really was good for. Nor did Jobs’s persistent whispers to reporters to just wait, that something cheaper and even better was coming soon, do the Lisa any favors.

Still, by many measures the Mac was not only cheaper but better than Lisa. Its 68000 architecture may have been unexceptional, but so was the Lisa’s — and the Mac’s 68000 was clocked at 8 MHz, a full 3 MHz faster than the Lisa’s. The Mac’s operating system was slim and lightweight, written in pure 68000 assembly language, as opposed to the Lisa’s bigger and more ambitious (overambitious?) operating system which was mostly written in Pascal. There was a price to be paid for the Mac’s slim efficiency; in some areas like multitasking and memory protection MacOS wouldn’t fully equal LisaOS until the arrival of OS X in 2001. But an average user just trying to get stuff done will make lots of compromises to have a snappy, usable interface — something which, at least in contrast to the Lisa, the Mac had in spades.

Condemned for years as a backwater project with little relevance to Apple’s business-centric corporate direction, the Macintosh now took center stage as the big launch approached, with Jobs and his band of pirates along with it. Macintosh was now the future of Apple; Macintosh simply had to succeed. The last five years at Apple had been marked by the ever-greater success of the Apple II almost in spite of its parent company and two colossal and expensive failures to develop a viable successor to that beloved platform. Apple was still a major force in the PC industry, with yearly revenues approaching $1 billion. Yet they were also in a desperately precarious position, dependent as they still were on the archaic Apple II technology and their absurdly high profit margins on same. At some point people had to stop buying the Apple II, which was now thoroughly outclassed in some areas (notably graphics and sound) by competition like the Commodore 64 that cost a fraction of the price. With the Apple III and the Lisa lost causes, the Macintosh by default represented Apple’s last chance to field a viable bridge to the Apple II-less future that had to come one of these days. Given the age of the Apple II, it was highly doubtful whether they would have time to go back to the drawing board and create yet another new machine for yet another kick at the can. A Macintosh failure would be strike three; it was Mac or bust. Steve Jobs and his team reveled in it and prepared to change the world.

The Macintosh was announced to the world on January 22, 1984. Early in the third quarter of Super Bowl XVIII and not long after one of IBM’s Charlie Chaplin spots for the ill-fated PCjr, an audience bloated with chips and beer and bored with a rather lackluster football game saw it: the “1984” spot, the most famous Super Bowl advertisement of all time.

Most people had no idea whatsoever what Apple was on about, had no idea that Big Brother represented the hated IBM who had taken the lead in business computing that Apple felt was rightfully theirs. The commercial was the talk of the media for the next few days, as everyone speculated about just what this “Macintosh” thing was and what it had to do with hammer-hurling freedom fighters. The advertisement, which it soon emerged had been directed by none other than that master of dystopia Ridley Scott of Alien and Blade Runner fame, would never darken a television screen again. No need; it had done its job, and would go down into history alongside Lyndon Johnson’s “Daisy” ad as one of the two most famous one-and-done commercials of all time.

The “1984” spot was an overheated, rather adolescent piece of rhetoric, coming off almost like a caricature of Apple’s exaggerated self-importance. It was by no means beloved by everyone even within Apple. The Mac’s moving up to become the company’s biggest priority hadn’t changed the determination of most of their executive wing to make it not as a maker of home and hobbyist computers, a competitor to Commodore and Atari and Radio Shack, but as a player in the much more lucrative field of business computing, where IBM (and, increasingly, IBM clones, a story for another time) ruled. Meanwhile Jobs still saw the Macintosh as he always had, as a way of changing not just the business world but the world full stop — which didn’t quite mean that he wanted to get down in the trenches with the likes of Commodore either, mind you, but also funneled his ambitions for the platform in a very different direction. Caught somewhere in the middle was John Sculley, a man who had been brought in thanks to his prowess as a consumer marketer but was nevertheless beholden to both factions. The constant push and pull between them, and the mixed messaging that resulted, would very nearly sink the Mac. Just before the Mac’s introduction, the business faction pushed through a rise in the list price from $2000 to a more businesslike $2500. But then came the “1984” commercial, whose lurid tone was all but guaranteed to repulse exactly the corporate leaders the business interests wanted to attract; these folks identified more with Big Brother than with the hammer-wielding freedom fighter. It would go on like that for a long time.

At the official launch on January 24, Jobs publicly committed Apple to the goal of selling 50,000 Macs in the first hundred days. It was dangerously ambitious; to miss the goal would be embarrassing and momentum-killing. In the end they managed it and then some; sales reached some 70,000, and they might have sold even more if not for teething problems at the factory typical of a new computer. Virtually all of the machines they sold, however, went not to corporations but to forward-thinking individuals of a certain technological bent and disposable income who rightly recognized in the Mac a new future paradigm. Douglas Adams, who saw his first Mac in Infocom’s offices and promptly fell in love, was archetypical of the demographic.

All of which was fine as far as it went — Apple was happy to sell to individuals too if they had the money to buy — but didn’t do a lot to further the dream of the Mac as a rival to the IBM PC on the desks of corporate America. Equally frustrating was much of the software that appeared that first year, which often tended toward games and other frivolous stuff frowned upon by corporations. By year’s end the early adopters with disposable income were already looking exhausted and corporations still weren’t buying. The result was tens of thousands of Macs piling up in warehouses and cancelled production orders. At year end total sales amounted to 250,000, about half of Jobs’s projections at launch time. And sales were getting worse every month, not better. It was beginning to look disconcertingly like Strike 3 — Apple III and Lisa all over again. The only thing keeping the company in the black was still the inexplicably evergreen Apple II, which in 1984, that supposed Year of the Macintosh, enjoyed its best sales yet. Revenue from the Apple II amounted to 2.5 times that from the Mac. Apple II loyalists, who despite Apple’s official claims of “Apple II Forever!” could see where the company’s real priorities lay, took no small delight in this reality.

Joanna Hoffman, the marketer who was with the Mac project almost from the beginning, frankly admitted later that the sales results were, at least in retrospect, unsurprising.

It’s a miracle that it sold anything at all. This was a computer with a single disk drive, no memory capacity, and almost no applications. People who bought it did so on seduction. It was not a rational buy. It was astonishing that Macintosh sold as many as it did.

Or, as Douglas Adams put it:

What I (and I think everybody else who bought the machine in the early days) fell in love with was not the machine itself, which was ridiculously slow and underpowered, but a romantic idea of the machine. And that romantic idea had to sustain me through the realities of actually working on the 128 K Mac.

Those realities could be hellish. The single floppy drive combined with the inadequate memory could make the original Mac as excruciating to actually use as it was fun to wax poetic about, with the process of just copying a single disk requiring more than fifty disk swaps and twenty minutes; with only one drive and precious little free memory left over to serve as a buffer, the machine could shuttle only a small chunk of a disk’s contents across per swap. MacWrite, the Mac’s flagship version of that bedrock of business applications, the word processor, was so starved for memory that you could only create a document of about eight pages. Determined Mac zealots swapped tips on how to chain files together to craft their Great American Novels, while the business world just shrugged and turned back to their ugly but functional WordStar screens. The Mac was a toy, at best an interesting curiosity; IBM was still the choice for real work.

"Test Drive" ad campaign

Sculley did his best to apply his Pepsi marketing genius to the Mac, but found it tough sledding. That Christmas Apple began the “Test Drive a Macintosh” campaign, which — shades of the Pepsi Challenge — let prospective buyers take a machine home for free to play with for 24 hours. Some 200,000 did so, but very few actually bought afterward, leaving stores with nothing but a bunch of used Macs to show for their trouble. For the 1985 Super Bowl, Apple attempted to recapture some of the Mac’s launch buzz with another high-concept commercial, this one depicting IBM users as mindless lemmings trudging off the side of a cliff. Ridley Scott’s brother Tony did the directing honors this time between pre-production work on Top Gun. But by now it all just felt kind of trite and childish, not to mention insulting to the very businesspeople Apple was trying to win over. Reaction from corporate America was so negative that Apple briefly considered taking out a full-page apology in the Wall Street Journal.

Apple’s summer of discontent, the rock-bottom point for the Mac, came in 1985. Not only were Mac sales still moribund, but by then another terrifying reality was becoming clear: Apple II sales were also slowing. The previous year had at last been the top of the bell curve. The day they had dreaded loomed, the day when they would have no viable next-generation machine and no faithful Apple II to fall back on. Apple closed three of their six factories and laid off 20 percent of their workforce, some 1450 people, that bleak summer.

Shortly after, Steve Jobs finally walked away from Apple following an acrimonious split with his erstwhile best mate John Sculley and a clumsy failed coup in the Apple boardroom. Jobs had proved psychologically incapable of accepting or addressing the Mac’s failings as both a piece of computer hardware and as a marketplace proposition. Jay Elliott, Apple’s head of human resources, summed up his situation beautifully:

[Jobs] could see that horizon out there, a thousand miles out. But he could never see the details of each little mile that had to be covered to get there. That was his genius and his downfall.

The Macintosh, like Apple itself, needed a practical repairman in 1985, not a bold visionary. This was a role Jobs was, at least at this phase of his life, eminently unqualified to play. And so he had made life intolerable for everyone, until the ugly public split that several generations of previous Apple management had only just found ways to avoid had come at last. The famed Apple mojo seemed all but gone, lost along with their charismatic founder.

But, as happens often (if not quite often enough) in business as in life, that summer proved to be the darkness before the dawn. Apple’s engineers had not been idle while the Mac struggled through its difficult first year, but had rather set doggedly to work to correct the worst of its failings. An external floppy drive became available a few months after launch, greatly alleviating the hell of disk swapping. The so-called “Fat Mac” with 512 K of memory, the amount most of the development team not named Jobs had agreed was appropriate from the start, appeared late in 1984. A hard disk and even cursor keys — their lack had been one of the more loathed aspects of the original machine if also a boon for makers of add-on keypads — were in the offing, as was, slowly and painfully, a workable networking system. The loss of Jobs only made such alleged dilutions of his vision easier to accomplish. The buggy original systems software was slowly tweaked and upgraded, while a third-party software ecosystem steadily grew on the backs of enthusiastic early adopters with money to spend. It didn’t come as quickly as Apple would have liked, and much of it wasn’t initially as businesslike as they might have liked, but the software — and with it a burgeoning community of famously loyal users — did come. Indeed, it was a third-party developer who arguably saved the Macintosh in tandem with another product of Apple’s busy engineering staff.

Paul Brainerd was a techie with a background in publishing who had for some time dreamed of finding a way to revolutionize the complicated and expensive process of traditional typesetting — pasteboards, huge industrial printers, and all the rest — through microcomputer technology. He had been stymied by two sore lacks: a computer with a high-resolution graphics display capable of showing what a document would look like on the printed page, pictures and all; and a printer capable of producing said document on paper. When he saw the Mac for the first time, he recognized that one of these needs had been met at last. When he reached out to Apple, they let him in on a secret: they had a solution for the other in the works as well, in the form of the LaserWriter, a laser printer that, at about $7000, counted as affordable in publishing terms. The combination of the Mac, the LaserWriter, and the software Brainerd would eventually produce to make use of them, Aldus PageMaker, would invent the field of desktop publishing and change everything for the Mac and for Apple.

Like so much else about the Mac, it wasn’t an entirely original concept. Back around 1975, Ginn & Co., a textbook publisher and Xerox subsidiary out of Boston, were gifted by the researchers at PARC with some Altos and a custom interface to hook them up to a big Dover laser printer. Ginn thereby became the first all-digital publisher in the world. “Initially the reaction to the concept was, ‘You’re going to have to drag me kicking and screaming,’” said Tim Mott, one of the PARC people chiefly responsible for the project. “But everyone who sat in front of that system and used it, to a person, was a convert within an hour.” It was in fact Ginn’s editors who coined the now-ubiquitous terms “cut” and “paste,” a reference to the old manual process of cutting out manuscripts and photographs and pasting them onto pasteboard for typesetting. Now, a decade later, the rest of the world would finally get the opportunity to follow Ginn’s lead. The Mac had its killer app for business at last.

In retrospect it should have been obvious. It had been obvious to Xerox, hardly a company revered for vision; their big attempt to package PARC’s innovations into commercial form had come with the Xerox Star, a “document-processing workstation” that was essentially a sneak preview of desktop publishing before the term existed. But Apple, and especially Jobs, had been so focused on the Macintosh as a revolutionary force of nature in all aspects of the human condition that they’d had trouble thinking in terms of the concrete, practical applications that made corporations buy computers.

Publishers loved PageMaker. It turned what had been an all-night, all-hands-on-deck process, a hot, dirty nightmare of paste and print and paper for countless small periodicals and corporate publishing departments, into something almost painless, something downright fun. Apple came to call PageMaker and its competitors, which were soon springing up like toadstools after a rain, their Trojan Horses. A brave purchasing manager would buy a couple of Macs and a LaserWriter as an experiment, and six months later the same company would be coming back for fifty or a hundred more. Publishing would become the first of several creative niche industries that the Mac would absolutely own, even as IBM continued to dominate the mainstream of business. It wasn’t quite the grand head-to-head challenge that Jobs had dreamed of, but, combined with sales of the Apple II that would remain in gentle decline yet surprisingly strong for the rest of the decade, it was a pretty good living.

Apple had been very, very lucky; they and the Mac had blundered through somehow. David Bunnell, longtime publisher of Macworld magazine, summarized the Mac’s formative years bluntly:

To hold up the Macintosh experience as an example of how to create a great product, launch an industry, or spark a revolution is a cruel joke. Anyone who models their business startup on the Macintosh startup is doomed to failure. Miracles like the Macintosh can only happen once.

If the bargain with practicality represented by the Macintosh as desktop-publishing specialist seems disheartening, consider how genuinely empowering just this one application was to countless people. For it wasn’t just big or medium-sized companies who bought Macs for the purpose. Especially as the prices of software and hardware came down, the small printers, the neighborhood associations, and the church groups could also get in on the act. It’s astonishing how ugly the average fanzine or newsletter of 1980 looks compared to that of 1995. The difference is almost entirely down to the Macintosh, which let people get their messages out there in a form no one need be embarrassed by. Many credited the Mac with literally changing their lives. Among them was a young man named Eliot Cohen, who used his Mac to start a popular newsletter devoted to his obsession, New York Mets baseball, and soon found himself in the locker room interviewing his heroes while the slick magazines called to beg for his insights. This democratizing of the means of production is one of the most inspiring outcomes of the PC revolution and, much as I’m ambivalent about some aspects of the platform and its parent company, of the Mac itself. Indeed, I have a special reason for giving credit where it’s due: the logical successors to the Mac-enabled fanzines that were everywhere by the early 1990s are blogs like this one. We’re still riding that same continuum of change.

Consider also how immense the Mac’s soft power was. People — even people who rejected the Mac itself as an overpriced boondoggle — somehow recognized that this was the way computers really ought to work. It became an ideal, striven for if seldom reached, for years afterward. No matter; other computers were the better for the striving. Even machines like the lowly Commodore 64 soon housed their own valiant attempts at replicating MacOS. To grasp the full scope of the changes wrought by the Mac, one need only compare the average Commodore 64 or Apple II game of, say, 1983 with its counterpart of 1986. A friendly GUI, of the sort which felt revolutionary when it appeared in the landmark Pinball Construction Set in 1983, was practically the baseline norm by 1986. The hardware hadn’t changed a whit; the vision of what could be done with it had. So, the Macintosh really did end up changing the world. Steve Jobs, wrong about so many nitpicky things, was breathtakingly right about that.

(The Macintosh story has been told so often and by so many that the biggest problem in writing an article like this one is sorting through it all and trying to inject some grounding into the more evangelistic accounts. My primary book sources were Insanely Great by Steven Levy; West of Eden by Frank Rose; Apple Confidential by Owen Linzmayer; and Dealers of Lightning by Michael A. Hiltzik. Andy Hertzfeld’s Folklore.org is also a goldmine. The Byte quote given above is from the February 1984 issue, part of a series of features greeting the Mac’s arrival. Various episodes of Computer Chronicles, archived by the dedicated folks at archive.org, also informed the article. See in particular “Mainframes to Minis to Micros”; “Integrated Software”; “Printers”; “Computer Ergonomics”; “The Macintosh Computer”; “Computer Graphics”; “Slowdown in the Silicon Valley” Parts One and Two; “Printers and Business Graphics”; and “Desktop Publishing” Parts One and Two. The photos sprinkled through the article are from Apple Confidential, except for the picture of the original Mac keyboard, which was taken from the aforementioned issue of Byte.)

 