The Shareware Scene, Part 1: The Pioneers

The digital society which we’ve created over the last few decades has upended many of our traditional notions about commerce. Everyday teenagers now stress over their ratings and advertising revenues on YouTube; gamers in “free” games pay staggering sums for the privilege of advancing through them a little faster (wasn’t the actual playing supposed to be the point of a game?); “clicks” and “likes” have become commodities that are traded in the same way that soybean futures are in the “real” world; consumers have become speculators in their own future entertainment on crowd-funding platforms like Kickstarter; a writer like me can ask for support from readers like you to allow me to make content that I then give away for free. (Thank you for that!) And, in the most direct parallel to our main topic for today, even some of the biggest corporations on the planet have learned to give away their products for free, then ask us to pay for them later.

Some of these new modes of commerce reflect the best in us, some perhaps the very worst. They all share in common, however, the quality of being markedly different from the old model wherein you paid someone an upfront amount of money and got some concrete good or service in exchange. As those of you with elderly parents or grandparents may well have learned, our modern digital economies have departed so far from that model in some areas that just explaining how they work to someone still wedded to the old ways can be a daunting task indeed. (I know that my 86-year-old father has literally no idea what I do all day or how I can possibly be earning money from it…) Maybe we too should ask the question that so many of our elders are already asking themselves every day: exactly how did we get from there to here so quickly?

It’s a bigger question than any one article can possibly answer. Still, it does turn out that we can trace at least one point of origin of our strange new ways of commerce to a trio of American pioneers who, all within a year of one another, embraced a new model for selling software — a model which has, one might say, taken over the world.


Andrew Fluegelman

The first of our pioneers is one Andrew Fluegelman. Born in 1943, Fluegelman within his first 35 years of life finished law school, passed the Bar exam, took up and then gave up corporate law, and settled into a whole new career as the owner, editor, and sole employee of the Headlands Press, a boutique book publisher in Marin County, California. He worked from time to time with the techno-utopian visionary Stewart Brand on The Whole Earth Catalog, and even the books he edited and published on his own had much the same counter-cultural DIY flavor: The New Games Book (a selection of friendly outdoor sporting activities for groups of adults), How to Make and Sell Your Own Record, Worksteads: Living and Working in the Same Place. Yet for all their hippie bona fides, Headlands books went out under the larger imprint of the international publishing titan Doubleday. The ability to speak the language of both the idealistic dreamer and the everyday businessperson proved a vital asset for Fluegelman throughout his life.

Like Brand and so many others of a similar bent, Fluegelman saw great potential in the personal computer as a force for social liberation. Therefore in 1981, before ever actually purchasing a computer of his own, he signed a contract with Doubleday to embark on a new book project, this time with himself in the role of coauthor rather than just editor. It was to be an exploration of the role of computers in the writing process, in terms of both current practicalities and future potential. He would of course need to buy himself a computer to complete the project. Just as he was about to pull the trigger on an Apple II, the IBM PC was announced. “I took one look at it and just had this gut feeling,” he said in a later interview. “This is what I want.”

While he waited for the machine he had ordered to arrive, Fluegelman, who had never touched a computer before in his life, started teaching himself BASIC from books. Even after the computer came in, learning to word-process on it remained on the back burner for a time while he continued to pursue his new passion for programming. His bible was that touchstone of a generation of amateur programmers, David Ahl’s million-selling book BASIC Computer Games. Fluegelman:

I got Ahl’s [book], and I said, “This is just what I want to do.” I typed [one of the games] in. It took me a day to get the bugs out and get the thing to run. And as soon as I saw the program running, I immediately started thinking, “Well, gee, I’d really like to add up the scores, and say this, and make a little noise…” I’d look through the book, and I’d say, “Oh, there’s something I could use. What happens if I stick it in there?”

I’m a real believer in the Berlitz method of programming. Which is: you learn how to say, “Please pass the salt,” [then] you look in the dictionary and look up the word for “pepper,” stick it in there, and, by God, someone gives you the pepper. And you know you’re making progress. Purely trial and error.

I liked it a lot. I abandoned all bodily functions for about a month.

Programmers are born as much as made. You either feel the intrinsic joy of making a machine carry out your carefully stipulated will or you don’t; the rest is just details. Clearly Fluegelman felt the joy.

Still, the book project wouldn’t wait forever. Fluegelman and Jeremy Joan Hewes, his coauthor, had the idea that they would indeed write the book together, but with each working on his or her own machine from his or her own office. They would share their files electronically; it would be one more way of practicing what they intended to preach in the book proper, about the new methods of working that were unlocked by the computer. But Hewes had an older CP/M computer rather than a flashy new IBM PC, and this stopped them in their tracks — for the only telecommunications package then available for the latter came from IBM themselves, and could only swap files using IBM’s proprietary protocols. Fluegelman thus found himself in the ironic position of being able to trade files with an IBM mainframe, but not with most of his peers in the world of personal computing. He could see only one solution:

[I] started out to write a communications program. I said, “Gee, I’d really like to do this, and I’d like to do that, and we should have a dialing directory, and we should have some macros…” And I just kept adding to it for my own use.

We eventually typeset the book using the program I wrote. In the process, I gave it to a lot of my friends, and they started using it. At the time it was the only program that let you do these things on the IBM PC; this was the early spring of 1982. And inevitably one of my friends said, “You know, you really ought to publish that.”

If I hadn’t been in the publishing business for eight years, I would have gone the traditional route — find a publisher, royalties — but I’d been through all that, and I’d seen the pitfalls and all the ways things can get derailed. And this was kind of a new medium, and I was still very exhilarated by it. And I said, having had all this fun, I just can’t go the same publishing route that I’ve gone before.

Throughout his life, Fluegelman had a special relationship with San Francisco’s Golden Gate Bridge. “I think it’s a power point,” he said once only semi-facetiously. “I have more inspirations driving across the Golden Gate Bridge…” One day shortly after finishing his program, he was driving across while thinking back to the pledge drive he had seen the night before on the local PBS television station.

My American readers will doubtless recognize the acronym, but, for the benefit of those of you in other places: PBS stands for “Public Broadcasting Service.” It’s a network of over-the-air television stations which show children’s programs (most famously Sesame Street) as well as documentaries, news, and high-culture content such as symphony concerts and dramatizations of classic literature. Although the stations are free to watch, they are unlike other free stations in that they don’t sustain themselves with advertising. Instead they rely on a limited degree of taxpayer funding, but most of all on donations, in any amount and frequency, from viewers who appreciate their content and consider it worth supporting. In some ways, then, PBS can be called the great forefather of the many non-coercive digital-funding models of today. And indeed, the tale of Andrew Fluegelman makes the thread that runs from PBS to so many modern Internet economies much more direct than it might otherwise seem.

For, driving across his favorite bridge that day, Fluegelman had a PBS-inspired epiphany. He would market his little telecommunications package under the name of PC-Talk, using a method no one had ever dreamed of before.

I said, I’ll just set it out there, encourage people to use it. If they like it, I’ll ask them to send me some money. [He set the initial “suggested” donation at $25.]

So, I sent out the first version of the program that way. I put some notices on The Source and CompuServe: I’ve got this program, I wrote it, it’ll do this and this. It’s available for free, but if you like it, send me the money. And even if you don’t like it, still make copies for your friends because maybe they’ll like it and send some money.

The response was really overwhelming. I was getting money! I remember on the first day I got a check in the mail, and I just couldn’t believe it. I almost got driven out of business filling orders. At the time I was still producing books, and software programming was my own late-night thing. And suddenly I was standing there all day filling orders and licking stamps and sending things out, and I had to hire someone to start doing that. I was totally unprepared for it.

While I had written the program to work very well in my own situation, once you start sending software out into the world you start hearing about people with all sorts of crazy circumstances that you haven’t anticipated at all. I think if I had tried to publish this first version of the program [conventionally], people would have reacted very negatively. But they didn’t because I’d sent it out in this unrestricted way. So people would write back and say, “This is great, but why don’t you add this? Why don’t you try this?” In many cases people even helped me re-program to deal with their situations. And I ended up calling that “freeback” instead of “feedback” because it was really getting free support back from the community.

The usually savvy Fluegelman did make a couple of puzzling decisions during these early days. The first was to name his revolutionary scheme for software distribution “Freeware.” If you twist your synapses around just right, you can almost arrive at the sense he was trying to convey, but under any more straightforward reading the name becomes dangerously counter-intuitive. Thousands upon thousands of developers who came after Fluegelman would work desperately, but only partially successfully, to make people understand that their software wasn’t in fact “free” at all: using it regularly placed an ethical demand upon the user to financially compensate the creator.

Then, having come up with such a flawed name, Fluegelman let the lawyer in him come to the fore: he went out and trademarked it. He imagined creating a proprietary “Freeware catalog,” collecting a lot of software that was marketed on the same model. Accordingly, he also included in his program’s liner notes a request for other programmers with useful software of their own to contact him, thereby to join him in a “unique marketing experiment.”

In the meanwhile, PC-Talk’s success was such that it quickly caught the attention of the business-computing mainstream. Already in August of 1982, the widely read InfoWorld magazine published an article on the subject, under the heading “CA man likens ‘Freeware’ to user-supported TV.” Fluegelman noted sensibly therein that, rather than fighting against the natural desire people had to make copies of their software and share them with their friends, Freeware leveraged it. He estimated that five copies of PC-Talk were made for every one that was downloaded directly from one of the commercial online services or sent out on disk by himself in response to a mailed request — and, unlike a conventional software publisher, he thought this ratio was just great.


Jim Knopf/Button

Our second pioneer was a far more experienced programmer than Fluegelman. Seattle-area resident Jim Knopf was only one year older than our first pioneer, but had already worked for IBM for many years as a systems analyst by the dawn of the microcomputer era. He built his first personal computer himself in 1978, then sold it to partially finance an Apple II. Among other things, he used that machine to keep track of the names and addresses of his church’s congregation. Knopf later wrote that “I liked what I produced so much [that] the program itself became a hobby — something I continued to work on and improve in my spare time.”

When the IBM PC was released in 1981, Knopf sold his Apple II and bought one of those instead. His first project on his new computer was to write a new version of his database program. As soon as said program was far enough along, Knopf started sharing it with his colleagues at IBM. They in turn shared it with their friends, and soon the database, which he called Easy File, went beyond his office, beyond Seattle, beyond Washington State. People encouraged him to upload it to the early online services; this he obligingly did, and it spread still faster.

Knopf was gratified by its popularity, but also bothered by it in a certain way. His database was still under active development; he was improving it virtually every week. But how to get these updates out to users? He included a note in the program asking users to “register” themselves so he could keep in touch with them; he maintained the resulting mailing list in Easy File itself. Yet keeping everyone up to date was prohibitively complicated and expensive in a world where most software was still passed around on floppy disks — a world where the idea of a program as a changing, improving entity rather than a static tool that just was what it was barely existed in the minds of most people. “How could I identify which of the users were serious ones – those that desired and required enhancements?” Knopf later wrote about his mindset at the time. “How could I afford to send mailings to notify them of the availability of improvements?”

So, in September of 1982, Knopf made a few moves which would define his future. First, he changed his own name for purposes of business. Worried that his Germanic surname would be too difficult for potential customers to pronounce and remember, he quite literally translated it into English. “Knopf,” you see, is the German word for the English “button” — and so Jim Knopf became Jim Button. (I’ll refer to him by the latter name from now on. Coincidentally, “Jim Knopf” is also the name of a character from a popular series of children’s books in Germany.) Next, he registered a company that referenced his new nom de plume: Buttonware. And, last but by no means least, he added a new note to his program. “I would ask those who received it to voluntarily send a modest donation to help defray my costs,” remembered Button later. “The message encouraged users to continue to use and share the program with others, and to send a $10 donation only if they wanted to be included in my mailing list.”

The very first person to contact Button in response told him that his approach was just the same as the one used by another program called PC-Talk. Button found himself a copy of PC-Talk, read its pitch to other programmers interested in joining the ranks of Freeware, and sent his own Easy File to Andrew Fluegelman. Fluegelman phoned Button excitedly on the same day that he received the package in the mail. The two of them hit it off right away.

While they waited for Fluegelman to find enough other quality software to make up his Freeware Catalog, the two agreed to form a preliminary marketing partnership. Button would rename his Easy File to PC-File and raise its price to $25 to create a kinship between the two products, and each program would promote the other, along with the Freeware trademark, in its liner notes. Button:

My wife said I was “a foolish old man” if I thought even one person would voluntarily send me money for the program. I was more optimistic. I suspected that enough voluntary payments would come to help pay for expansions to my personal-computer hobby – perhaps several hundred dollars. Maybe even a thousand dollars (in my wildest dreams!).

As it happened, he would have to learn to dream bigger. Like PC-Talk, PC-File turned into a roaring success.


The founding staff of PC World magazine. Andrew Fluegelman stands in the very back, slightly right of center.

Both programs owed much of their early success to the extracurricular efforts of the indefatigable Andrew Fluegelman. Shortly after releasing PC-Talk to such gratifying interest, Fluegelman had given the final manuscript of his word-processing book to Doubleday, who would soon publish it under the title Writing in the Computer Age. Still as smitten as ever by the potential of personal computing, he now embarked on his third career: he became a full-time computer journalist. He initially wrote and edited articles for PC Magazine, the first periodical dedicated to the IBM PC, but got his big break when he was asked to join the staff of a new rival known as PC World. Within a few issues, Fluegelman became editor-in-chief there.

Not coincidentally, the magazine lavished glowing coverage upon PC-Talk and PC-File. The latest version of Button’s program, for example, got a six-page feature review — as much space as might be devoted to a major business-software release from the likes of Microsoft or VisiCorp — in PC World’s September 1983 issue. “What was previously a very desirable program is now just about mandatory for much of the PC population,” the review concluded. “If you use PC-File and don’t send Jim Button a check, the guilt will kill you. And it should.”

Button and his family were vacationing in Hawaii when the review appeared. Button:

The response was overwhelming. Our house sitter had to cart the mail home daily in grocery sacks.

When we arrived home, the grocery sacks were strewn all over the basement floor. We had to step over and around them just to get into our basement office. My son, John, worked days, evenings, and weekends just catching up on the mail. Life would never be the same for any of us!

Button would later date the beginning of Buttonware as a real business to these events. Nine months later, he quit his job with IBM, by which time he was making ten times as much from his “moonlighting” gig as from his day job.

Ironically, though, Button had already parted ways to some extent with Fluegelman by the time that life-changing review appeared. Fluegelman was finding it difficult to focus on his idea of starting a Freeware catalog, given that he was already spending his days running one of the biggest magazines in the computer industry and his evenings improving and supporting PC-Talk. Button:

Andrew got questions about my program and I got questions and requests about his. Checks were sent to the wrong place. The work required to correct all this grew exponentially. We had to make the separation.

Button came up with his own moniker for the distribution model he and Fluegelman had pioneered: “user-supported software.” That name was perhaps less actively misleading than “Freeware,” but still didn’t really get to the heart of the matter. Other names that were tried, such as “quasi-public domain,” were even worse. Luckily, the perfect moniker — one that would strike exactly the right note, and do it in just two syllables at that — was about to arrive along with Bob Wallace, the third principal in our little drama.


In this iconic picture of the early Microsoft, Bob Wallace is in the middle of the back row.

Like Jim Button, Bob Wallace was based in Seattle, and was a veteran of the kit era of personal computing. In fact, his experience with microcomputers stretched back even further than that of his counterpart: he had been the founder in 1976 of the Northwest Computer Society, one of the first hobbyist user groups in the country. Shortly thereafter, Paul Allen recruited him away from the computer store where he worked, and he became Microsoft’s ninth employee. In time, he became the leading force behind Microsoft’s implementation of the Pascal programming language. But, as an unreformed hippie whose social idealism paralleled his taste for psychedelic drugs, he found both Microsoft’s growing bureaucracy and its founders’ notoriously sharp-elbowed approach to business increasingly uncongenial as time went on. In March of 1983, he was for the first time refused permission to barge into Bill Gates’s office unannounced to argue some technical point or other, as had always been his wont. It was the last straw; he quit in a huff.

Taking note of Fluegelman and Button’s success, he wrote a word processor using his own Pascal implementation, and released it as PC-Write under the same payment model. To encourage its distribution, he added an extra incentive: any user who mailed in the suggested donation of $75 received a special registration code, which she was then expected to enter into her copy of the program. When she gave this copy to others, it was thus tagged with its source. If any users of those copies sent in the fee, Wallace would send $25 to the user whose tag it bore; he later claimed that at least one person made $500 in these commissions. In its roundabout way, the scheme pioneered the idea of not just asking users for a donation out of the goodness of their hearts, but of marking the software and altering its functionality for those who sent in the payment, all through the soon-to-be ubiquitous mechanism of the registration code.
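The bookkeeping behind Wallace’s referral scheme can be sketched in modern terms. This is purely a hypothetical illustration — nothing is known here about how PC-Write actually generated or verified its codes — with only the $75 fee and $25 commission taken from the historical record:

```python
# Hypothetical sketch of a PC-Write-style referral registry.
# Only the $75 registration fee and $25 commission are historical;
# every name and mechanism below is illustrative.

import secrets

REGISTRATION_FEE = 75   # suggested donation, in dollars
COMMISSION = 25         # paid to the user whose tagged copy led to a registration


class Registry:
    def __init__(self):
        self.codes = {}     # registration code -> owning user
        self.earnings = {}  # user -> total commissions earned

    def register(self, user, referral_code=None):
        """Record a paid registration; credit the referrer, if the new
        user's copy was tagged with a known registration code."""
        if referral_code in self.codes:
            referrer = self.codes[referral_code]
            self.earnings[referrer] = self.earnings.get(referrer, 0) + COMMISSION
        # Issue the new user a code of their own, which stamps any
        # copies they pass along to friends.
        code = secrets.token_hex(4)
        self.codes[code] = user
        return code
```

Used this way, the incentive compounds: if Alice registers and shares her tagged copy, each friend who later registers from it earns her another $25.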

But Wallace’s biggest contribution of all came in the form of a name. And therein lies a tale in itself.

Back in July of 1982, an InfoWorld magazine editor named Jay Lucas had started a column on “freeware” without being aware of Fluegelman’s counter-intuitive use of that term; Lucas took the word to mean any and all freely distributed software, whether the author asked for an eventual payment in return or not. The following spring, Fluegelman contacted the magazine to inform them of his trademark and ask them to cease and desist from violating it. So, Lucas launched a contest among his readers to come up with a new name. He reported in the InfoWorld issue dated May 30, 1983, that “at least a dozen” readers had sent in the same suggestion: “shareware.” He announced that he would be using this name henceforth. At the time, he still made no distinction between “free” software that came with financial strings attached and software that didn’t. He was, in other words, effectively using “shareware” as a synonym for all types of freely distributed software.

But when Bob Wallace saw the name, he knew that it was perfect for his distribution model: pithy, catchy, with all the right intimations. He contacted Lucas, who told him that he was free to use it; InfoWorld made no legal claim on the name. So, when PC-Write went out later that year, it described itself as “shareware.”

In early 1984, Softalk IBM, a brief-lived spinoff of a much-loved Apple II magazine, hired one Nelson Ford to write a regular column about “public-domain software.” Unsure what he should call the distribution model being used by each of Fluegelman, Button, and Wallace under a different name, he started off by employing the manifestly inadequate placeholder “quasi-public domain.” But in his May 1984 column, he announced a contest of his own: “A free disk of software and widespread publicity for the person sending in the best name for quasi-PD, contribution-suggested software. Since Andy won’t let anyone use ‘freeware,’ we’ll have to come up with another catchy name.”

He received such dubious suggestions as “conscience-wear” — “the longer you use the software, the more it wears on your conscience if you do not pay” — and “tryware.” But, just as Lucas had over at InfoWorld, Ford kept getting most of all the suggestion of “shareware.” Unaware of the name’s origin at InfoWorld, but well aware of its use by Wallace, he suspected that “shareware” would be as impossible for him to appropriate as “freeware.” Nevertheless, he inquired with Wallace — and was pleasantly surprised to be told that he was more than welcome to it. Ford announced the new name in the August 1984 issue of Softalk IBM.

It’s questionable whether the actual column in which he made the announcement was all that influential in the end, given that the issue in which it appeared was also the last one that Softalk IBM ever published. Still, Ford himself was a prominent figure online and in user-group circles. His use of the name going forward in those other contexts, combined with that of Jay Lucas in InfoWorld, probably had a real impact. Yet one has to suspect that it was PC-Write itself which truly spread the name hither and yon.

For, perhaps because a word processor, unlike a telecommunications program or a database, was a piece of software which absolutely every computer owner seemed to need, Wallace was even more successful with his first piece of shareware than the two peers who had beaten him onto the scene had been with theirs. The company he founded, which he called QuickSoft, would peak with annual sales of more than $2 million and more than 30 employees, while PC-Write itself would garner more than 45,000 registered users. Staying true to his ideals, Wallace would always refuse to turn it into a boxed commercial product with a price tag in the hundreds of dollars, something many conventional software publishers were soon pressuring him to do. “I’m out to make a living, not a killing,” he said.

Jim Button was less inclined to vocalize his ideals, but one senses that much the same sentiment guided him. Regardless, he too did very well for himself. Already by 1984, he was getting approximately $1000 worth of checks in the mail every day. While PC-File itself never garnered quite the popularity of PC-Write — about 7000 users registered their copies in the end — Button soon branched out well beyond that first effort. Buttonware would peak with annual sales of $4.5 million and 35 employees.

Those who jumped on the shareware bandwagon afterward would find it very difficult to overtake these two pioneers in terms of either income or market impact. As late as 1988, Compute! magazine judged that the two most impressive shareware products on the market were still PC-File and PC-Write, two of the first three ever released. But PC-Talk would have a shorter lifespan — and, much more tragically, so would its creator.


The founding staff of Macworld magazine. Andrew Fluegelman can just be seen at the very back, slightly left of center.

The PC World issue with the landmark review of PC-File was still on newsstands when Andrew Fluegelman had his next life-changing encounter with a computer: he was one of a select few invited to Apple for an early unveiling of the new Macintosh. He was so smitten by this whole new way of operating a computer that he immediately began lobbying for a companion magazine to PC World, to be named, naturally enough, Macworld. Its first issue appeared in time to greet the first Macintosh buyers early in 1984. Fluegelman held down the editor-in-chief job there even as he continued to fill the same role at PC World.

He was utterly unfazed to thus be straddling two encampments between which Apple was trying to foment a holy war. He spoke about the differences between the two aesthetics of computing in an interview that, like so much of what he said back then, rings disarmingly prescient today:

People [say the Macintosh is] more of a right-brain machine and all that. I think there is some truth to that. I think there is something to dealing with a graphical interface and a more kinetic interface; you’re really moving information around, you’re seeing it move as though it had substance. And you don’t see that on [an IBM] PC. The PC is very much a conceptual machine; you move information around the way you move formulas, elements on either side of an equation. I think there’s a difference.

I think the most important thing is to realize that computers are tools, that unless you want to become an expert programmer, the main thing that a computer provides you is the ability to express yourself. And if it’s letting you do that, if you now have hands on those tools, then you can be a force for good out in the world, doing the things that you used to do, that you’re still doing — representing your own ideas, not changing your persona to suddenly become a “computer person.”

And I think that may be the advantage of the Macintosh.

At bottom, Fluegelman himself wasn’t really a “computer person” in the sense of Button and Wallace, both of whom had been programming since the 1960s. And then, running not one but two of the biggest computer magazines in the country could hardly leave him with much free time. Thus PC-Talk was somewhat neglected, and other telecommunications software — some of it released under the burgeoning shareware model — took its place. Fluegelman accepted this with equanimity; he was never inclined to stay in one place for very long anyway. In an interview conducted at the very first Macworld Expo in January of 1985, he spoke of his excitement about the future — both his personal future and the world’s technological future:

I think this is just the next adventure for a lot of us to get into. I know the intellectual excitement the [computer] has caused for me. It’s really been a rejuvenation, and anything that gets you that pumped up has got to be something that you can use in a good way.

I also think that people who do get excited about computers and involved in all this are almost uniformly intelligent, interesting people. I never have been as socially involved, as interconnected with as many different kinds of people, as when I started getting involved with computers. I think that the easier it is for people to express themselves, and to share their views with others, that’s got to be a good democratic force.

It’s great to go along for 40 years and still find your life changing and new things happening. It makes you look forward to what’s going to happen when you’re 60, what’s going to happen when you’re 80.

Quotes like these are hard to square with what happened to Andrew Fluegelman just six months later.

On July 6, 1985, Fluegelman left his office as usual at the end of a working day, but never arrived at his home; he simply disappeared. A week later, police discovered his Mazda hatchback parked near the toll plaza at the entrance to the Golden Gate Bridge. They found a note addressed to his wife and family inside, but its contents have never been published. Nevertheless, we can piece some things together. It seems that his health hadn’t been good; he’d been suffering from colitis, for which he’d begun taking strong medication that was known to significantly impact many patients’ psychology — and, indeed, friends and colleagues in the aftermath mentioned that he’d been acting erratically in the final few days before his disappearance. There are reports as well that he may have recently received a cancer diagnosis. At any rate, the implications seem clear: the 41-year-old Andrew Fluegelman went back to one of his favorite places in the world — the bridge where he had invented the revolutionary concept of shareware if not the name — and jumped 220 feet into the water below. His body was never recovered.

The legacy of those brief four years between his discovery of the joys of BASIC and his death by suicide encompasses not only the shareware model but also PC World and especially Macworld. The latter went on to become arguably the most literate, thoughtful computer magazine ever, one of the vanishingly few to evince a genuine commitment to good writing in the abstract. In doing so, it merely held to the founding vision of its first editor-in-chief. One can’t help but wonder what else this force of nature might have done, had he lived.


At shareware’s peak in the early and mid-1990s, quite a number of countries had at least one glossy newsstand magazine devoted exclusively to the subject.

By that fateful day in 1985, shareware was already becoming an unstoppable force, with more and more programmers throwing their hats into the ring. To be sure, most of them didn’t build seven-figure businesses out of it, as Jim Button and Bob Wallace did. Inevitably for a distribution model that placed all of its quality control on the back end, much of the shareware that was released wasn’t very good at all. Yet even many of those who didn’t get to give up their day jobs did receive the satisfaction and capitalistic validation of being paid real money, at least every once in a while, for something they had created. In time, this loose-knit band of fellow travelers began to take on the trappings of a movement.

To wit: in February of 1987, a “Meeting of Shareware Authors” assembled in Houston to chat and kibitz about their efforts. Out of that meeting grew the Association of Shareware Professionals six months later, with founding chairmen Jim Button and Bob Wallace. In the years that followed, the ASP published countless shareware catalogs and pamphlets; they even published a 780-page book in 1993 called The Shareware Compendium, which represented the last attempt anyone ever made to list in one place all of the staggering quantity of shareware that was available by that point. But perhaps even more importantly, the ASP acted as a social outlet for the shareware authors themselves, a way of sharing hints and tips, highs and lows, dos and don’ts with one another.

There arose more big success stories out of all this ferment. For example, one Phil Katz was responsible for what remains today the most tangible single software artifact of the early shareware scene. In 1986, he started a little company called PKWare to distribute a reverse-engineered shareware clone of ARC, the most popular general-purpose compression program of the time. When the owners of ARC came after him with legal threats, he switched gears and in 1989 released PKZIP, which used an alternative, much more efficient compression format of his own design. Although he sold PKZIP as shareware — $25 donation requested, $47 for a printed manual — he also scrupulously documented the compression format it used and left the door open for other implementations of it. He was rewarded with sweet revenge: ZIP quickly superseded ARC all across the digital world. Striking a fine balance between efficiency and ease of implementation, not to mention being unencumbered by patents, it has remained the world’s most common compression format to this day, a de facto standard that is now built right into many operating systems.
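Indeed, the extent of ZIP’s triumph is easy for a modern reader to demonstrate: the format Katz documented is now bundled not only into operating systems but into the standard libraries of most programming languages. A minimal sketch in Python, for instance (the file name and contents here are arbitrary):

```python
# A small illustration of how completely Phil Katz's ZIP format has
# been absorbed into modern platforms: Python ships a full
# implementation in its standard library. File name is arbitrary.
import zipfile

# Write a tiny archive using DEFLATE, the compression method that
# later versions of PKZIP standardized on.
with zipfile.ZipFile("demo.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("readme.txt", "ZIP: a de facto standard since the early 1990s.")

# Any ZIP-aware tool could read this archive back.
with zipfile.ZipFile("demo.zip") as zf:
    print(zf.read("readme.txt").decode())
```

Nothing beyond the standard library is required, which is rather the point.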

Another success story is less earthshaking and more esoteric, but instructive nonetheless as an illustration of just how far the shareware model could be stretched. In a time when desktop publishing was one of the biggest buzzwords in computing, a veteran of print publishing named Gary Elfring took a hard look at the current state of digital fonts, and noted how expensive those offered by major foundries like Adobe tended to be. He started Elfring Soft Fonts to distribute shareware typefaces, and made a lot of money from them in the late 1980s and early 1990s, before the established vendors of word processors and operating systems got their acts together in that department.

I could go on and on with such stories, but suffice to say that many people did very, very well from shareware during its heyday.

Like any movement, shareware also came complete with internecine disputes. One constant source of tension was the many third parties who collected shareware which they didn’t own on physical media for distribution. As early as 1984, the librarian of the Silicon Valley Computer Society users group caused an uproar when he started selling floppy disks filled with shareware for $6 apiece, a figure somewhat above the cost of blank disks and postage alone. “It’s not legal,” said Andrew Fluegelman flatly at the time. “I’m opposed to it because when somebody spends even $6 for a disk, they feel they’ve paid for it and see little reason to pay again for it. I’m concerned about somebody building a product around my product.” But, in a rare break with Fluegelman, Jim Button had a different point of view: “With that [price], all he’s doing is helping me distribute sample copies.” He continued in later years to believe that “distribution is one of the cornerstones of sales. All other factors being equal, if you can double your distribution you will double your sales.”

In the end, Button’s point of view carried the day. Shareware authors were never entirely comfortable with the “parasites” who profited off their software in this way, and Fluegelman’s worry that many users would fail to distinguish between paying a cataloger and paying the actual creator of the software was undoubtedly well-founded. Yet the reality was that the vast majority of computer owners would not go online until the World Wide Web struck in the mid-1990s. In the meantime, floppy disks — and eventually CD-ROMs — were the only realistic mechanism for reaching all of these otherwise isolated users. The catalogers and the authors had to learn to live with one another in an uneasy symbiotic relationship.

Another, even more bitter dispute within the ranks of shareware was touched off near the end of the 1980s, when some authors started opting to “encourage” registration by releasing crippled versions of their software — programs that only functioned for a limited time, or that blocked access to important features — that could only have their full potential unlocked via the input of a valid registration code. Although Bob Wallace had ironically pioneered the idea of a registration code that was input directly into a program, he and most of the other early shareware pioneers hated to see the codes used in this way. For the socially conscious Wallace, it was a moral issue; his vision for shareware had always been to collect payment from those who could pay, but not to deprive those who couldn’t of quality software. Button as well preferred to rely upon the honor system: “Don’t get off on the wrong foot with your users with things like crippled programs, time-limited programs, and other negative incentives to register your software. If you can’t trust your users to pay for truly good software, then you should stay out of the shareware business.” Under the influence of these two founding chairmen, the ASP refused for a time to admit shareware authors who freely distributed only crippled versions of their software.
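For readers curious how such registration codes worked mechanically, the idea can be sketched in a few lines. What follows is a purely hypothetical toy scheme — the names, the checksum, and the code format are all inventions for illustration, though actual shareware authors used countless home-grown variants that were often no more sophisticated than this:

```python
# A toy, entirely hypothetical illustration of a shareware-style
# registration check. Real programs of the era used countless
# home-grown variants of this basic idea.
def expected_code(name: str) -> str:
    # Derive a short code from the registered name via a simple
    # position-weighted checksum -- weak by design, as many
    # period schemes were.
    total = sum(ord(c) * (i + 1) for i, c in enumerate(name.upper()))
    return f"{total % 100000:05d}"

def is_registered(name: str, code: str) -> bool:
    # The full feature set unlocks only when the typed-in code
    # matches the one the author mailed back after payment.
    return code == expected_code(name)

name = "JANE DOE"
print(expected_code(name))           # the code the author would send
print(is_registered(name, "00000"))  # a wrong code leaves it crippled
```

The author would run `expected_code` on the paying customer’s name and mail back the result; the crippled program would simply refuse to unlock until `is_registered` returned true.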

In the end, though, the ASP would be forced to relax their stance, and “crippleware” would become nearly synonymous with shareware in many circles, for better or for worse. In 1989, Nelson Ford, who had earlier popularized the name “shareware,” set up a service for authors which let people register their software over the telephone using their credit cards instead of having to mail checks or cash through the post. The ease of passing out registration codes this way, without having to send out disks and/or documentation or do any additional work at all, probably led many more authors to go the crippleware route. In fairness to those who decided to implement such schemes, it should be noted that they didn’t have the advantages that went along with being first on the scene, and were often marketing to less committed computer users with a less nuanced sense of the ethics of intellectual property and the sheer amount of work that goes into making good software of any stripe.


In a strange sort of way, Windows 10 is actually a shareware product.

The buzz around shareware gradually faded in the second half of the 1990s, and by soon after the turn of the millennium the term was starting to seem like an antiquated relic of computing’s past. Even the Association of Shareware Professionals eventually changed their name to the Association of Software Professionals, before doddering off entirely. (A website still exists for the organization today, but it doesn’t appear to have been updated in some years.)

Yet it would be profoundly inaccurate to say that shareware died as anything but a name. On the contrary: it conquered the world to such an extent that it became the accepted means of distributing much or most software, and as such is no longer in need of any particular name. Just about everyone is selling shareware today — not only the sometimes honest, sometimes dodgy small vendors of “try before you buy” utilities of many types, but also some of the biggest corporations in the world. Microsoft, for example, now distributes Windows using what is essentially the shareware model: users download a copy for free, enjoy a limited trial period, and then need to purchase a registration code if they wish to go on using it. Many other software developers have stuck to their idealistic guns and put their creations out there uncrippled, asking for a donation only from those who can afford it. And, as I mentioned to open this piece, the overarching spirit of shareware, if you will, has infected countless digital economies that don’t involve downloads or registration keys at all.

Jim Button and Bob Wallace got to see some of these later developments, but they weren’t active participants in most of them. Wallace gradually divested himself from Quicksoft after 1990. Ever the hippie, he devoted his time to the study and promotion of psychedelic drugs and other “mind-expanding technologies” via publications and foundations. He died in 2002 at age 53 from a sudden attack of pneumonia that may or may not have been related to his quest for chemical transcendence.

Jim Button (né Knopf) very nearly died even younger. At the age of 49 in 1992, he had a major heart attack. He survived, but wasn’t sure that he could continue to cope with the stress of running his shareware business. At the time, big players like Microsoft were pouring enormous resources into their own productivity software, and the likes of little Buttonware had no real hope of competing with them anymore. This combination of factors prompted Button to slowly wind his company down; after all, his decade in shareware had already left him with enough money to enjoy a comfortable early retirement. He died in 2013, a few weeks shy of his 71st birthday. He continued until the end to downplay his role in the evolution of software distribution and digital culture. “I’m not a visionary man,” he said. “I never saw the future, but I was lucky enough to be in the right place at the right time, with the right ideas and a proper amount of energy.”

Some might say that the “right ideas” are synonymous with vision, but no matter; we’ll let him keep his modesty. What he and his fellow pioneers wrought speaks for itself. All you have to do is look around this place we call the Internet.

(Sources: the books The New Games Book by the New Games Foundation, Writing in the Computer Age by Andrew Fluegelman and Jeremy Joan Hewes, and Gates by Stephen Manes and Paul Andrews; Softalk IBM of May 1984, June 1984, July 1984, and August 1984; Byte of June 1976, June 1983, July 1984, March 1985, and September 1987; 80 Computing of May 1987; Ahoy! of February 1984; CompuServe Magazine of December 1990 and March 1992; Family Computing of March 1984; InfoWorld of July 5 1982, August 23 1982, December 20 1982, March 7 1983, May 30 1983, June 27 1983, July 30 1984, September 17 1984, October 22 1984, July 29 1985, December 23 1985, August 25 1986, and December 7 1987; MicroTimes of May 1985 and August 1985; Games Machine of October 1987; Compute! of February 1985 and June 1988; PC World of September 1983; Macworld premiere issue. Online sources include The Association of Software Professional’s website, Michael E. Callahan’s “History of Shareware” on Paul’s Picks, The Charley Project‘s entry on Andrew Fluegelman’s disappearance, the Shareware Junkies interview with Jim “Button” Knopf, “Jim Button: Where is He Now?” at Dr. Dobb’s, the M & R Technologies interview with Jim Knopf, the Brown Alumni Monthly obituary of Bob Wallace, and a 1989 online discussion of the newly released PKZip archived by Jason Scott. My thanks to Matthew Engle for giving me the picture of Shareware Magazine included in this article.)

 
 


The (7th) Guest’s New Clothes

Once upon a time, two wizards decided to remake the face of computer gaming with the help of a new form of magic known as CD-ROM. They labored for years on their task, while the people waited anxiously, pouncing upon the merest hint the wizards let drop of what the final product would look like.

At long last — well after the two wizards themselves had hoped — the day of revelation came. Everyone, including both the everyday people and the enlightened scribes who kept them informed on the latest games, rushed to play this one, which they had been promised would be the best one ever. And at first, all went as the wizards had confidently expected. The scribes wrote rapturously about the game, and hordes of people bought it, making the wizards very rich.

But then one day a middle-aged woman, taking a break from reckoning household accounts by playing the wizards’ game, said to her husband, “You know, honey, this game is really kind of slow and boring.” And in time, a murmur of discontent spread through many ranks of the people, gaining strength all the while. The cry was amplified by a disheveled young man with a demon of some sort on his tee-shirt and a fevered look in his eyes: “That’s what I’ve been saying all along! The wizards’ game sucks! Play this one instead!” And he hunched back down over his computer to continue playing his very different sort of game, muttering something about “gibs” and “frags” as he did so.

The two wizards were disturbed by this growing discontent, but resolved to win the people over with a new game that would be just like their old one, except even more beautiful. They worked on it too for years to make it as amazing as possible. Yet when they offered it to the people, far fewer of them bought it than had bought their first game, and their critics grew still louder and more strident. They tried yet one more new game of the same type, yet more beautiful, but by now the people had lost interest entirely; few could even be bothered to criticize it. The wizards started bickering with each other, each blaming the other for their failures.

One of the wizards, convinced he could do better by himself, went away to make still more games of the same type, but the people remained stubbornly uninterested; he finally gave up and found another occupation. From time to time, he tries again to see if the people want another game like the one they seemed to love so much on that one occasion long ago, but he is invariably disappointed.

The other wizard — perhaps the wiser of the two — said, “If you can’t beat ’em, join ’em.” He joined the guild that included the violent adolescent with the demon on his shirt, and enjoyed a return to fortune if not fame.

Such is the story of Trilobyte Games in a nutshell. Today, we remember 1993 as the year that Cyan Productions and id Software came to the fore with Myst and Doom, those two radically different would-be blueprints for gaming’s future. But we tend to forget that the most hyped company and game of the year were in fact neither of those pairings: they were rather Trilobyte and their game The 7th Guest. Echoing the conventional wisdom of the time, Bill Gates called The 7th Guest “the future of multimedia,” and some even compared Graeme Devine and Rob Landeros, the two “wizards” who had founded Trilobyte together, to John Lennon and Paul McCartney. Sadly for the wizards, however, The 7th Guest had none of the timeless qualities of the Beatles’ music; it was as of its own time as hula hoops, love beads, or polyester leisure suits were of theirs.


Rob Landeros and Graeme Devine

Unlike their alter egos in the Beatles, Graeme Devine and Rob Landeros grew up in vastly different environments, separated not only by an ocean but by the equally enormous gulf of seventeen years.

Born in Glasgow, Scotland, in 1966, Devine was one of the army of teenage bedroom coders who built the British games industry from behind the keyboards of their Sinclair Spectrums. His first published work was actually a programming utility rather than a game, released as part of a more complete Speccy programmer’s toolkit by a company known as Softek in the spring of 1983. But it was followed by his shoot-em-up Firebirds just a few months later. That game’s smooth scrolling and slick presentation won him a reputation. Thus one day the following year the phone rang at his family’s home; a representative from Atari was on the line, asking if he would be free to port their standup-arcade and console hit Pole Position to the Spectrum.

Over the next several years, Devine continued to port games from American publishers to the Europe-centric Spectrum, while also making more original games of his own: Xcel (1985), Attack of the Killer Tomatoes (1986), Metropolis (1987). His originals tended to be a bit half-baked once you really dove in, but their technical innovations were usually enough to sustain them, considering that most of them only cost a few quid. Metropolis, the first game Devine programmed for MS-DOS machines, provides a prime example of both his technical flair and his complete lack of detail orientation. A sort of interactive murder mystery taking place in a city of robots, sharing only a certain visual sensibility with the Fritz Lang film classic of the same name, it includes almost-decipherable “voice acting” for its characters, implemented without the luxury of a sound card; the speech plays entirely through the early IBM PC’s internal speaker. The game itself, on the other hand, is literally unfinished; it breaks down halfway through its advertised ten cases. Perhaps Devine decided that, given that he included no way to save progress in his rather time-consuming game, no one would ever get that far anyway.

Metropolis

Metropolis was published through the British budget label Mastertronic, whose founder Martin Alper was a force of nature, famous as a cultivator of erratic young talent like Devine. Alper sold Mastertronic to Richard Branson’s Virgin empire just after Metropolis was released, and soon after that absconded to Southern California to oversee the newly formed American branch of Virgin Games. On a routine visit back to the Virgin mother ship in London in 1988, he dropped in on Devine, only to find him mired in a dark depression; it seemed his first serious girlfriend had just left him. “England obviously isn’t treating you well,” said Alper. “Why don’t you come with me to California?” Just like that, the 22-year-old Devine became the head of Virgin Games’s American research and development. It was in that role that he met Rob Landeros the following year.

Landeros’s origin story was about as different from Devine’s as could be imagined. Born in 1949 in Redlands, California, he had lived the life of an itinerant bohemian artist. After drifting through art school, he spent much of the 1970s in hippie communes, earning his keep by drawing underground comic books and engraving tourist trinkets. By the early 1980s, he had gotten married and settled down somewhat, and found himself fascinated by the burgeoning potential of the personal computer. He bought himself a Commodore 64, learned how to program it in BASIC, and even contributed a simple card game to the magazine Compute!’s Gazette in the form of a type-in listing.

But he remained a computer hobbyist only until the day in early 1986 that an artist friend of his by the name of Jim Sachs showed him his new Commodore Amiga. Immediately struck by the artistic possibilities inherent in the world’s first true multimedia personal computer, Landeros worked under Sachs to help illustrate Defender of the Crown, the first Amiga game from a new company called Cinemaware. After that project, Sachs elected not to stay on with Cinemaware, but instead recommended Landeros for the role of the company’s art director. Landeros filled that post for the next few years, illustrating more high-concept “interactive movies” which could hardly have been more different on the surface from Devine’s quick-and-dirty budget games — but which nevertheless tended to evince some of the same problems when it came to the question of their actual gameplay.

Whatever the catalog’s flaws in that department, Martin Alper over at Virgin was convinced that it represented an early proof of concept for gaming’s future. As Cinemaware founder Bob Jacob and many others inside and outside his company well recognized, their efforts were hobbled by the need to rely on cramped, slow floppy disks to store all of their audiovisual assets and stream them into memory during play. But with CD-ROM on the horizon for MS-DOS computers, along with new graphics and sound cards that would make the platform even more audiovisually capable than the Amiga, that could soon be a restriction of the past. Alper asked Devine to interview Landeros for the role of Virgin’s art director.

Landeros was feeling “underappreciated and underpaid” at Cinemaware, as he puts it, so he was very receptive to such an offer. When he called Devine back after hearing the message the latter had left on his answering machine, he found the younger man in an ebullient mood. He had just gotten engaged to be married, Devine explained, to a real California girl — surely every cloistered British programmer’s wildest fantasy. Charmed by the lad’s energy and enthusiasm, Landeros let himself be talked into a job. And indeed, Devine and Landeros quickly found that they got on like a house on fire.

Tall and skinny and bespectacled, with unkempt long hair flying everywhere, Devine alternated the euphoria with which he had first greeted Landeros with bouts of depression such as the one Martin Alper had once found him mired in.  Landeros was calmer, more grounded, as befit his age, but still had a subversive edge of his own. When you first met him, he had almost a patrician air — but when he turned around for the first time, you noticed a small ponytail snaking down his back. While Devine was, like so many hackers, used to coding for days or weeks on end, sometimes to the detriment of his health and psychological well-being, Landeros needed a very good reason indeed to give up his weekend motorcycle tours. Devine was hugely impressed by Landeros’s tales of his free-spirited life, as he was by the piles of self-inked comic books lying about his home; Landeros was repeatedly amazed simply at the things Devine could make computers do. The two men complemented each other — perhaps were even personally good for one another in some way that transcends job and career.

Their work at Virgin, however, wasn’t always the most exciting. The CD-ROM revolution proved late in arriving; in the meantime, the business of making games continued pretty much as usual. In between his other duties, Devine made Spot, an abstract strategy game which betrayed a large debt to the ancient Japanese board game of Go whilst also serving as an advertisement for the soft drink 7 Up; if not quite a classic, it did show more focus than his earlier efforts. Meanwhile Landeros did the art for a very Cinemaware-like cross-genre concoction called Spirit of Excalibur. In his spare time, he also helped his friend and fellow Cinemaware alumnus Peter Oliphant with a unique word-puzzle/game-show hybrid called Lexi-Cross. (Rejected by Alper because “game shows need a license in order to sell,” it was finally accepted by Interplay after that company’s head Brian Fargo brought a copy home to his wife and she couldn’t stop playing it. Nonetheless, it sold hardly at all, just as Alper had predicted.)

Devine and Landeros were itching to work with CD-ROM, but everywhere they went they were told that the market just wasn’t there yet. As they saw it, no one was buying CD-ROM drives because no one was making compelling enough software products for the new medium. It was a self-fulfilling prophecy, a marketplace Gordian knot which someone had to break. Accordingly, they decided to put together their own proposal for a showpiece CD-ROM game. Both were entranced by Twin Peaks, the darkly quirky murder-mystery television series by David Lynch, which had premiered in the spring of 1990 and promptly become an unlikely mass-media sensation. Sitting in the airport together one day, they overheard the people around them debating the question of the year: who killed Laura Palmer?

Imagine a game that can fascinate in the same way, mused Devine. And so they started to brainstorm. They pictured a game, perhaps a bit like the board game Clue — tellingly, the details of the gameplay were vague in their minds right from the start — that might make use of a Twin Peaks license if such a thing was possible, but would go for that sort of vibe regardless. Most importantly, it would pull out all the stops to show what CD-ROM — and only CD-ROM — could do; there would be no floppy version. Indeed, the project would be thoroughly uncompromising in all of its hardware requirements, freeing it from the draconian restrictions that came with catering to the lowest common denominator. It would require one of a new generation of so-called “Super” VGA graphics cards, which would let it push past the grainy resolution of 320 × 200, still the almost universal standard in games, to a much sharper 640 × 480.

To keep the development complications from spiraling completely out of control, it could take place in a haunted house that had a group of people trapped inside, being killed one by one. Sure, Agatha Christie had done it before, but this would be different. Creepier. Darker. A ghost story as well as a mystery, all served up with a strong twist of David Lynch. “Who killed Laura Palmer? Who killed Laura Palmer? We wanted to create that sort of intrigue,” remembers Landeros.

When they broached the possibility of a Twin Peaks game with Alper, he was definitive on one point: there wasn’t enough room in his budget to acquire a license to one of the hottest media properties in the country. They should therefore focus their thinking on a Twin Peaks-like game, not the real thing. Otherwise, he was noncommittal. “Give me a detailed written proposal, and we’ll see,” he said.

At this point in our story, it would behoove us to know something more of Martin Alper the man, a towering figure whose shadow loomed large over all of Virgin Games. A painter and sculptor of some talent during his free time, Alper was also an insatiable culture vulture, reading very nearly a novel per day and seeing several films per week. His prodigious consumption left no space for games. “I’ve never played any game,” he liked to boast. “What interests me is the cultural progress that games can generate. I’m looking to make a difference in society.” He liked to think of himself as a 1990s incarnation of Orson Welles, nudging his own group of Mercury Players into whole new fields of creative expression. When Devine and Landeros’s detailed proposal landed on his desk in November of 1990, full of ambition to harness the current zeitgeist in the service of a new medium, it hit him right where he lived. Even the proposed budget of $300,000 — two to three times that of the typical Virgin game — put him off not at all.

So, he invited Devine and Landeros to a lunch which has since gone down in gaming lore. After the niceties had been dispensed with, he told the two bluntly that they had “no future at Virgin Games.” He enjoyed their shock for a while — a certain flair for drama was also among his character traits — then elaborated. “Your idea is too big to be developed here. If you stayed here, you’d quickly overrun our offices. I can’t afford to let you do that. Other games have to be made here as well.”

“What do you suggest?” ventured Devine.

And so Alper laid out his grand plan. They should start their own studio, which Virgin Games would finance. They could work where they liked and hire whomever they liked, as long as the cost didn’t become too outrageous and as long as they stayed within 90 minutes of Virgin’s headquarters, so that Alper and David Bishop, the producer he planned to assign to them, could keep tabs on their progress. And they would have to plan for the eventuality of a floppy-disk release as well, if, as seemed likely, CD-ROM hadn’t yet caught on to a sufficient degree with consumers by the following Christmas, the game’s proposed release date. They were simple requirements, not to mention generous beyond Devine and Landeros’s wildest dreams. Nevertheless, they would fail to meet them rather comprehensively.

In the course of his hippie wanderings, Landeros had fallen in love with the southern part of Oregon. After the meeting with Alper, he suggested to Devine that they consider setting up shop there, where the biking and motorcycling were tremendous, the scenery was beautiful, the people were mellow, and the cost of living was low. When Devine protested that one certainly couldn’t drive there from Virgin’s offices within 90 minutes, Landeros just winked back. Alper hadn’t actually specified a mode of transportation, he noted. And one could just about fly there in an hour and a half.

On December 5, 1990, the pair came for the first time to Jacksonville, Oregon, a town of just 2000 inhabitants. It so happened that the lighting of the town Christmas tree was taking place that day. All of the people had come out for the occasion, dressed in Santa suits and Victorian costumes, caroling and roasting chestnuts. Just at sunset, snow started to fall. Devine, the British city boy far from home, looked around with shining eyes at this latest evolution of his American dream. Oregon it must be.

So, during that same visit, they signed a lease on a small office above a tavern in an 1884-vintage building — wood floors, a chandelier on the ceiling, even a fireplace. They hired Diane Moses, a waitress from the tavern below, to serve as their office manager. Then they went back south to face the music.

The 7th Guest was created in this 1884-vintage building in Jacksonville, Oregon, above a tavern which is now known as Boomtown Saloon.

Alper was less than pleased at first that they had so blatantly ignored his instructions, but they played up the cheap cost of living and complete lack of distractions in the area until he grudgingly acquiesced. The men’s wives were an even tougher sell, especially when they all returned to Jacksonville together in January and found a very different scene: a bitter cold snap had caused pipes to burst all over town, flooding the streets with water that had now turned to treacherous ice, making a veritable deathtrap of the sidewalk leading up to their new office’s entrance. But the die was now cast, for better or for worse.

The studio which Devine and Landeros had chosen to name Trilobyte officially opened for business on February 1, 1991. The friends found that working above a tavern had its attractions after a long day — and sometimes even in the middle of one. “It’s fun to watch the fights spill out onto the street,” said Devine to a curious local newspaper reporter.

The first pressing order of business was to secure a script for a game that was still in reality little more than a vague aspiration. Landeros had already made contact over the GEnie online service with Matthew Costello, a horror novelist, gaming journalist, and sometime tabletop-game designer. He provided Trilobyte with a 100-page script for something he called simply Guest. Graeme Devine:

We presented the basic story to Matt, and he made it into a larger story, built the characters and the script. He created it out of what was really just a sketch. We were anxious that the [setting] be very, very closed. One that would work as a computer environment. That’s what he gave us.

The script took place within a single deserted mansion, and did all of its storytelling through ghostly visions which the player would bump into from time to time, and which could be easily conveyed through conveniently non-interactive video snippets. Like so many computer games, in other words, Guest would be more backstory than story.

Said backstory takes place in 1935, and hinges on a mysterious toy maker named Henry Stauf — the anagram of Faust is intentional — who makes and sells a series of dolls which cause all of the children who play with them to sicken and die. When the people of his town figure out the common thread that connects their dead children, they come for him with blood in their eyes. He barricades himself in his mansion to escape their wrath — but sometime shortly thereafter he lures six guests into spending a night in the mansion, with a promise of riches for those who survive. Falling victim either to Stauf’s evil influence or their own paranoia, or both, the six guests all manage to kill one another, Agatha Christie-style, over the course of the night, all without ever meeting Stauf himself in the flesh. But there is also a seventh, uninvited guest, a street kid named Tad who sneaks in and witnesses all of the horror, only to have his own soul trapped inside the mansion. It becomes clear only very slowly over the course of the game that the player is Tad’s spirit, obsessively recapitulating the events of that night of long ago, looking for an escape from his psychic prison in the long-deserted mansion.

The backstory of how Stauf came to take up residence in his mansion is shown in the form of a narrated storybook right after the opening credits.

The only thing missing from Costello’s script was any clear indication of what the player would be expected to do in the course of it all. Trilobyte planned to gate progress with “challenges to the player’s intellect and curiosity. Our list of things to avoid includes: impossible riddles, text parsers, inventories, character attribute points, sword fights, trolls, etc. All actions are accomplished via mouse only. Game rules will either be self-explanatory or simple enough to discover with minimal experimentation.” It sounded good in the abstract, but it certainly wasn’t very specific. Trilobyte wouldn’t seriously turn to the game part of their game for a long, long time to come.

The question of Guest‘s technical implementation was almost as unsettled, but much more pressing. Devine and Landeros first imagined showing digitized photographs of a real environment. Accordingly, they negotiated access to Jacksonville’s Nunan House, a palatial three-story, sixteen-room example of the Queen Anne style, built by a local mining magnate in 1892. But, while the house was fine, the technology just wouldn’t come together. Devine had his heart set on an immersive environment where you could see yourself actually moving through the house. Despite all his technical wizardry, he couldn’t figure out how to create such an effect from a collection of still photographs.


The Mansion

The Nunan House in Jacksonville, Oregon, whose exterior served as the model for the Stauf Mansion. The interior of the latter was, however, completely different, with the exception only of a prominent central staircase.



A breakthrough arrived when Devine and Landeros shared their woes with a former colleague from Virgin, an artist named Robert Stein. Stein had been playing for several months with 3D Studio, a new software package from a company known as Autodesk which let one build and render 3D scenes and animations. It was still an awkward tool in many ways, lagging behind similar packages for the Commodore Amiga and Apple Macintosh. Nonetheless, a sufficiently talented artist could do remarkable things with it, and it had the advantage of running on the MS-DOS computers on which Trilobyte was developing Guest. Devine and Landeros were convinced when Stein whipped up a spooky living room for them, complete with a ghostly chair that flew around of its own accord. Stein soon came to join them in Jacksonville, becoming the fourth and last inhabitant of their cozy little office.


3D Studio

The 7th Guest was the first major game to make extensive use of Autodesk’s 3D Studio, a tool that would soon become ubiquitous in the industry. Here we see the first stage of the modeling process: the Shaper, in which an object is created as a two-dimensional geometric drawing, stored in the form of points and vectors.

In the Lofter, an object’s two dimensions are extruded into three, as the X- and Y-coordinates of its points are joined to Z-coordinates.

The Materials Editor is used to apply textured surfaces to what were previously wire-frame objects.

The 3D Editor is used to build a scene by hanging objects together in a virtual space and defining the position, color, and intensity of light sources.

The Keyframer is used to create animation. The artist arranges the world in a set of these so-called key frames, then tells the computer to extrapolate all of the frames in between. The process was an extremely time-consuming one on early-1990s computer hardware; each frame of a complex animation could easily take half an hour to render.
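The "tweening" the Keyframer performed can be sketched in a few lines: the artist pins an object's position at a handful of key frames, and the computer interpolates every frame in between. Below is a minimal linear-interpolation sketch in Python; real packages, 3D Studio included, also offered smoother spline-based easing, and all names here are illustrative.

```python
# Keyframe "tweening": the artist fixes positions at key frames, and
# the computer fills in every frame between them. Linear version only.

def tween(keyframes, frame):
    """Return the interpolated (x, y, z) position at a given frame.

    keyframes: sorted list of (frame_number, (x, y, z)) tuples.
    """
    # Clamp to the first/last key frame outside the animated range.
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the two key frames bracketing the requested frame.
    for (f0, p0), (f1, p1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0.0 at f0, 1.0 at f1
            return tuple(a + (b - a) * t for a, b in zip(p0, p1))

# An object sliding ten units along X over thirty frames:
keys = [(0, (0.0, 0.0, 0.0)), (30, (10.0, 0.0, 0.0))]
print(tween(keys, 15))  # halfway there: (5.0, 0.0, 0.0)
```

The interpolation itself is nearly free; as the caption notes, it was the *rendering* of each resulting frame that consumed the half hour.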



Even using 3D Studio, Guest would still fall well short of the ideal of an immersive free-scrolling environment. At the time, only a few studios — most notably Looking Glass Technologies and, to a much more limited extent, id Software of eventual Doom fame — were even experimenting with such things. The reality was that making interactive free-scrolling 3D work at all on the computer hardware of the era required drastic compromises in terms of quality — compromises which Trilobyte wasn’t willing to make. Instead they settled for a different sort of compromise, in the form of a node-based approach to movement. The player is able to stand only at certain pre-defined locations, or nodes, in the mansion. When she clicks to move to another node, a pre-rendered animation plays, showing her moving through the mansion.
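In essence, the mansion becomes a graph whose edges are canned video clips rather than traversable space. A minimal sketch of the idea in Python — the node and clip names are hypothetical, invented purely for illustration:

```python
# Node-based movement: the player stands only at fixed nodes, and each
# directed edge between nodes maps to a pre-rendered animation clip
# streamed from CD. (Node and file names here are made up.)

transitions = {
    # (from_node, to_node): clip showing the camera glide between them
    ("foyer", "staircase"): "foyer_to_stairs.clip",
    ("staircase", "library"): "stairs_to_library.clip",
}

def move(current, destination):
    """Attempt a move; return the new node and the clip to play."""
    clip = transitions.get((current, destination))
    if clip is None:
        return current, None     # no edge exists: the click is ignored
    return destination, clip     # play the clip, then update the state

pos, clip = move("foyer", "staircase")
print(pos, clip)  # staircase foyer_to_stairs.clip
```

Gating progress then reduces to adding or withholding edges in this table — which is exactly why, as described later, the player ends up wandering in search of whichever edge has newly appeared.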

Just streaming these snippets off CD fast enough to play as they should taxed Devine’s considerable programming talents to the utmost. He would later muse that he learned two principal things from the whole project: “First, CD-ROM is bloody slow. Second, CD-ROM is bloody slow.” When he could stretch his compression routines no further, he found other tricks to employ. For example, he got Landeros to agree to present the environment in a “letter-boxed” widescreen format. Doing so would give it a sense of cinematic grandeur, even as the black bars at the top and bottom of the monitor dramatically reduced the number of pixels Devine’s routines had to move around. A win-win.
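The arithmetic behind that trick is simple enough to show directly. Assuming, purely for illustration, a full 640×480 SVGA frame letterboxed down to a 640×320 strip, the savings look like this:

```python
# Why letterboxing helped: fewer pixels per frame means less data to
# stream and decompress from a slow CD-ROM drive. The 320-line strip
# height is an illustrative assumption, not a documented figure.

full_frame = 640 * 480           # full SVGA frame, in pixels
letterboxed = 640 * 320          # widescreen strip between the bars
saving = 1 - letterboxed / full_frame
print(f"{saving:.0%} fewer pixels per frame")  # 33% fewer pixels per frame
```

A third fewer pixels to decode and blit, on every single frame, for the price of two cosmetic black bars.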

With the interior of the mansion slowly coming into being, the time was nigh to think about the ghostly video clips which would convey the story. Trilobyte recruited local community-theater thespians to play all the parts; with only $35,000 to spend on filming, including the camera equipment, they needed actors willing to work for almost nothing. The two-day shoot took place in a rented loft in Medford, Oregon, on a “stage” covered with green butcher paper. The starring role of Stauf went to Robert Hirschboeck, a fixture of the annual Oregon Shakespeare Festival, which was (and is) held in nearby Ashland. Diane Moses, Trilobyte’s faithful office manager, also got a part.

Robert Hirschboeck, the semi-professional Shakespearean actor who played the role of Stauf in The 7th Guest and its sequel. He was bemused by the brief fame the role won him: “I’ll be walking down the street and meet someone with all the CD-ROM gear, and they’ll say, ‘Ah, man, I’ve been looking at your ugly mug for 60 hours this week.'”

Trilobyte believed, with some justification, that their game’s premise would allow them to avoid some of the visual dissonance that normally resulted from overlaying filmed actors onto computer-generated backgrounds: their particular actors represented ghosts, which meant it was acceptable for them to seem not quite of the world around them. To enhance the impression, Trilobyte added flickering effects and blurry phosphorescent trails which followed the actors’ movements.


The Chroma-Key Process

A technique known as chroma-keying was used by The 7th Guest and most other games of the full-motion-video era to blend filmed actors with computer-generated backgrounds. The actor is filmed in front of a uniform green background. After digitization, all pixels of this color are rendered transparent. (This means that green clothing is right out for the actors…)

Meanwhile a background — the “stage” for the scene — has been created on the computer.

Finally, the filmed footage is overlaid onto the background.
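The compositing step just described can be sketched as a toy program: any pixel close enough to the key green is treated as transparent, letting the rendered background show through. Frames here are nested lists of (r, g, b) tuples and the tolerance value is an arbitrary assumption; real pipelines work on digitized video and use more sophisticated color-distance tests.

```python
# Toy chroma-key compositing: green-screen pixels become transparent,
# so the computer-generated "stage" shows through behind the actor.

KEY = (0, 255, 0)  # the green-screen color

def is_key(pixel, tolerance=60):
    # Simple per-channel distance test against the key color.
    return all(abs(c - k) <= tolerance for c, k in zip(pixel, KEY))

def composite(actor_frame, background_frame):
    """Overlay the filmed actor onto the rendered background."""
    return [
        [bg if is_key(fg) else fg for fg, bg in zip(fg_row, bg_row)]
        for fg_row, bg_row in zip(actor_frame, background_frame)
    ]

actor = [[(0, 250, 0), (200, 180, 160)]]   # green pixel + a skin tone
stage = [[(30, 30, 60), (30, 30, 60)]]     # rendered background color
print(composite(actor, stage))  # [[(30, 30, 60), (200, 180, 160)]]
```

This is also why green clothing was "right out": any green garment would read as transparency, punching a background-shaped hole through the actor.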



While Trilobyte built their 3D mansion and filmed their actors, the project slipped further and further behind schedule. Already by May of 1991, they had to break the news to Alper that there was no possibility of a Christmas 1991 release; Christmas 1992 might be a more realistic target. Luckily, Alper believed in what they were doing. And the delay wasn’t all bad at that; it would give consumers more time to acquire the SVGA cards and CD-ROM drives they would need to run Guest — for by now it was painfully clear that a floppy-disk version of the game just wasn’t going to happen.

In January of 1992, Devine, Landeros, and Stein flew to Las Vegas for the Winter Consumer Electronics Show. They intended to keep a low profile; their plan was simply to check out the competition and to show their latest progress to Alper and his colleagues. But when he saw what they had, Alper broke out in goosebumps. Cinema connoisseur that he was, he compared it to Snow White and the Seven Dwarfs, Walt Disney’s first feature film, which forever changed the way people thought about cartoon animation. What Snow White had done for film, Alper said, Guest could do for games. He decided on the spot that it needed to be seen, right there and then. So, he found a computer on the show floor that was currently demonstrating a rather yawn-inducing computerized version of Scrabble and repurposed it to show off Guest. To make up for the fact that Trilobyte’s work had no music as of yet, he put on a CD of suitably portentous Danny Elfman soundtrack extracts to accompany it.

Thanks to this ad hoc demonstration, Guest turned into one of the most talked-about games of the show. Its stunning visuals were catnip to an industry craving killer apps that could nudge reluctant consumers onto the CD-ROM bandwagon. Bill Gates hung around the demo machine like a dog close to feeding time. Virgin’s competitor Origin Systems, of Wing Commander and Ultima fame, also sat up and took notice. They highlighted Guest as the game to watch in their internal newsletter:

Here’s a tip: keep an eye out for Guest, a made-for-CD-ROM title from Oregon developer Trilobyte for Virgin Games. In it, you explore a 22-room haunted mansion, complete with elaborate staircases, elegant dining rooms, a gloomy laboratory, and see-through ghosts. The version we saw is in a very primitive stage; there’s no real story line yet and many of the rooms are only rendered in black and white. But the flowing movement and brilliant detail in a few scenes which are fleshed-out are nothing less than spectacular. Ask anybody who saw it.

None of the press or public seemed to even notice that it was far from obvious what the player was supposed to do amidst all the graphical splendor, beyond the vague notion of “exploring.” The Trilobyte trio flew back to Oregon thoroughly gratified, surer than ever that all of their instincts had been right.

Still, with publicity came expectations, and also cynicism; Bill Gates’s enthusiasm notwithstanding, a group of multimedia experts at Microsoft said publicly that what Trilobyte was proposing to do was simply impossible. Some believed the entire CES demo had been a fake.

Trilobyte remained a tiny operation: there were still only Devine, Landeros, Stein, and Moses in their digs above the tavern. Other artists, as well as famed game-soundtrack composer George “The Fat Man” Sanger, worked remotely. But Devine, who had always been a lone-wolf coder, refused to delegate any of his duties now, even when they seemed about to kill him. “I’ve never seen someone work so hard on a project,” remembers one Virgin executive. The Fat Man says that “Graeme wanted to prove everyone else a liar. He knew he was going to be able to do it.” This refusal to delegate began to cause tension with Alper and others at Virgin, especially as it gradually became clear that Trilobyte was going to miss their second Christmas deadline as well. Virgin had now sunk twice the planned $300,000 into the project, and the price tag was still climbing. Incredibly, Trilobyte’s ambitions had managed to exceed the 650 MB of storage space on a single CD, a figure that had heretofore seemed inconceivably enormous to an industry accustomed to floppy disks storing barely 1 MB each; Guest was now to ship on two CDs. Devine and Landeros agreed to work without salary to appease their increasingly impatient handlers.

Only in these last months did an already exhausted Devine and Landeros turn their full attention to the puzzles that were to turn their multimedia extravaganza into a game. Trilobyte was guided here by a simple question: “What would Mom play?” They found to their disappointment that many of the set-piece puzzles and board and card games they wanted to include were still under copyright. Their cutting-edge game would have to be full of hoary puzzles plundered from Victorian-era texts.

But at least Trilobyte could now see the light at the end of the tunnel. In January of 1993, they made a triumphant return to CES, this time with far more pomp and circumstance, to unveil the game they were now calling The 7th Guest. Alper sprang for a haunted-house mock-up in the basement of the convention hall, to which only a handpicked group of VIPs were admitted for a “private screening.” Bill Gates was once again among those who attended; he emerged a committed 7th Guest evangelist, talking it up in the press every chance he got. And why not? It blew Sherlock Holmes Consulting Detective, the current poster child for CD-ROM gaming, right out of the water. Sherlock‘s herky-jerky video clips, playing at a resolution of just 160×100, paled next to The 7th Guest‘s 3D-rendered SVGA glory.

When it was finally released in April of 1993, the reaction to The 7th Guest exceeded Virgin and Trilobyte’s fondest hopes. Virgin began with a production run of 60,000, of which they would need to sell 40,000 copies to break even on a final development budget of a little over $700,000. They were all gone within days; Virgin scrambled to make more, but would struggle for months to keep up with demand. “Believe it or not, The 7th Guest really does live up to all the hype,” wrote Video Games and Computer Entertainment magazine. “It takes computer entertainment to the next level and sets new standards for graphics and sound.” What more could anyone want?



Well, in the long run anyway, a lot more. The 7th Guest would age more like raw salmon than fine wine. Already just two and a half years after its release to glowing reviews like the one just quoted, the multimedia trade magazine InterAction was offering a much more tepid assessment:

As a first-generation CD-ROM-based experience, The 7th Guest broke new ground. It also broke a lot of rules – of course, this was before anyone knew there were any rules. The music drowns out the dialog; the audio is not mixable. The video clips, once triggered, can’t be interrupted, which in a house of puzzles and constant searching leads to frustration. How many times can you watch a ghost float down a hallway before you get bored?

Everywhere The 7th Guest evinces the telltale signs of a game that no one ever bothered to play before its release — a game the playing of which was practically irrelevant to its real goals of demonstrating the audiovisual potential of the latest personal computers. Right from the moment you boot it up, when it subjects you to a cheesy several-seconds-long sound clip you can’t click past, it tries your patience. The Ouija Board used to save and restore your session seems clever for about half a minute; after that it’s simply excruciating. Ditto the stately animations that sweep you through the mansion like a dancing circus elephant on Quaaludes; the video clips that bring everything to a crashing halt for a minute or more at a time; the audio clips of Stauf taunting you which are constantly freezing the puzzles you’re trying to solve. The dominant impression the game leaves you with is one of slowness: the slowness of cold molasses coming out of the jar, of a glacier creeping over the land, of the universe winding down toward its heat death. I get fidgety just thinking about it.

One of the game’s few concessions to player convenience is this in-game map. Yet it’s made so annoying to use that you hardly want to. First, you have to click through a menu screen which forces you to watch it tediously fading in and out, like every screen in the game. And then you have to watch the game fill in the map with colors square by exasperating square to indicate where you’ve solved the puzzles and where you still have puzzles remaining. This game would make an excellent trial of patience for a Zen school, if such institutions exist.

The puzzles that are scattered through the rooms of the mansion gate your progress, but not for any reason that is discernible within the environment. When you solve certain puzzles, the game simply starts letting you go places you couldn’t go before. In practice, this means that you’re constantly toing and froing through the mansion, looking for whatever arbitrary new place the game has now decided to let you into. And, as already noted, moving around takes forever.

The puzzles themselves were already tired in 1993. Landeros has been cheeky enough to compare The 7th Guest to The Fool’s Errand, Cliff Johnson’s classic Macintosh puzzler, but the former’s puzzles haven’t a trace of the latter’s depth, grace, wit, or originality. Playing The 7th Guest exposes a pair of creators who were, despite being unquestionably talented in other ways, peculiarly out of their depth when it came to the most basic elements of good game design.

For example, one of the puzzles, inevitably, is an extended maze, which the vast majority of players solve, assuming they do so at all, only through laborious trial and error. “The solution to the maze was on a rug in one of the bedrooms,” notes Devine. “We thought people would copy that down.” A more experienced design team would have grasped that good game design requires consistency: all of the other puzzles in the game are completely self-contained, a fact which has trained the player long before she encounters the maze not to look for clues like this one in the environment. Alternatively, testers could have told the designers the same thing. The 7th Guest provides yet one more illustration of my maxim that the difference between a bad game and a good one is the same as that between a game that wasn’t played before its release and one that was. “Our beta testing was, well, just us,” admits Devine.

Another infamous lowlight — easily the worst puzzle in the game in purely abstract design terms — is a shelf of lettered soup cans which you must rearrange to spell out a message. The problem is that the sentence you’re looking for makes sense only under a mustily archaic Scottish diction that vanishingly few players are likely to be familiar with.

But the worst puzzle in practical terms is actually Devine’s old abstract strategy game Spot, imported wholesale, albeit with the intelligence of your computer opponent cranked up to literally superhuman levels. It’s so difficult that even the official strategy guide throws up its hands, offering only the following clarification: “It is not necessary to beat this game to advance through The 7th Guest, and you will not be missing anything if you can’t beat it. To our knowledge, nobody has a consistent strategy to beat this game, not even Graeme!” The most serious problem here, even beyond the sheer lunacy of including a mini-game that even the programmer doesn’t know how to beat, is that the player doesn’t know that the puzzle is unnecessary. Thus she’s likely to waste hours or days on an insurmountable task, thinking all the while that it must gate access to a critical part of the plot, just like all the other puzzles. (What did I say about consistency?) Its presence is unforgivably cruel, especially in a game that advertised itself as being suitable for casual players.

None of the other puzzles are quite as bad as these, but they are samey — three of the 22 are chess puzzles, doubtless all drawn from the same Victorian book — at wild variance with one another in difficulty, and just generally dull, in addition to being implemented in ways calculated to maximize their tedium. Playing the game recently to prepare for this article, I never once felt that rush that accompanies the solution of a really clever puzzle. Working through these ones does indeed feel like work, made all the more taxing by the obstinately form-over-function interface. The best thing to be said about the puzzles is that they can all be bypassed by consulting an in-game hint book in the mansion’s library, albeit at the cost of missing the video clips that accompany their successful solutions and thus missing out on that part of the plot.

Still, one might want to argue that there is, paradoxical though it might sound, more to games than gameplay. Aesthetics have a value of their own, as does story; certainly The 7th Guest is far from the first adventure game with a story divorced from its puzzles. In all of these areas as well, however, it’s long since curdled. The graphics, no longer able to dazzle the jaded modern eye with their technical qualities, stand revealed as having nothing else to offer. There’s just nothing really striking in the game’s visual design — no compelling aesthetic vision. The script as well manages only to demonstrate that Matthew Costello is no David Lynch. It turns out that subversive surrealistic horror is harder to pull off than it looks.

As for the actors… I hesitate to heap too much scorn on them, given that they were innocent amateurs doing their best with a dodgy script in what had to feel like a thoroughly strange performing situation. Suffice to say, then, that the acting is about as good as that description would suggest. On the other hand, it does seem that they had some fun at least some of the time by hamming it up.


Indeed, the only claim to aesthetic or dramatic merit which The 7th Guest can still make is that of camp. Even Devine acknowledges today that the game is more silly than scary. He now admits that the story is “a bit goofy” and calls the game “Scooby Doo spooky” rather than drawing comparisons to The Shining and The Haunting, as he did back in the day. Which is progress, I suppose — but then, camp is such a lazy crutch, one that far too many games try to lean upon.



“The 7th Guest just kept selling and selling,” says its producer David Bishop of the months after its release. “We’d look at the sales charts and it had incredible legs. Sales were picking up, not slowing down.” By the end of 1996, the game would sell well over 2 million copies. Trilobyte was suddenly flush with cash; they earned $5 million in royalties in the first year alone. Nintendo gave them a cool $1 million upfront for the console rights; Paul Allen came along with another $5 million in investment capital. Trilobyte moved out of their little office above the tavern into a picturesque old schoolhouse, and started hiring the staff that had been so conspicuously missing while they made their first game. Then they moved out of the schoolhouse into a 29,000-square-foot monstrosity, formerly a major bank’s data center.

The story of Trilobyte after The 7th Guest becomes that of two merely smart men who started believing that they really were the infallible geniuses they were being hyped as. “Trilobyte thought they could pick up any project and it would turn to gold,” says one former Virgin staffer. “They had huge egos and wanted to grow,” says another. Even writer Matthew Costello says that he “could see the impact the attention from The 7th Guest had on [Devine and Landeros’s] perceptions of themselves.”

Despite the pair’s heaping level of confidence and ambition, or perhaps because of it, Trilobyte never came close to matching the success of The 7th Guest. The sequel, called The 11th Hour, shipped fully two and a half years later, but nonetheless proved to be just more of the same: more dull puzzles, more terrible acting, more technically impressive but aesthetically flaccid graphics. The zeitgeist instant for this sort of thing had already passed; after a brief flurry of early sales, The 11th Hour disappeared. Other projects came and went; Trilobyte spent $800,000 on Dog Eat Dog, a “workplace-politics simulator,” before cancelling it. Meanwhile Clandestiny, another expensive game in the mold of The 7th Guest, sold less than 20,000 copies to players who had now well and truly seen that the guest had no clothes.

Dog Eat Dog, Trilobyte’s never-released “workplace-politics simulator.”

Rob Landeros gradually revealed himself to be a frustrated filmmaker, always a dangerous thing to have around a game-development studio. Worse, he was determined to push Trilobyte into “edgy” content, rife with adult themes and nudity, which he lacked sufficient artistic nuance to bring to life in ways that didn’t feel crass and exploitative. When Devine proved understandably uncomfortable with his direction, the two fast friends began to feud.

The two founders were soon pulling in radically different directions, with Landeros still chasing the interactive-movie unicorn as if Doom had never happened, while Devine pushed for a move into real-time 3D games like the ones everyone else was making. New Media magazine memorably described Landeros’s Tender Loving Care as “a soft-porn film with a weak plot and rancid acting” after getting a sneak preview; the very name of Devine’s Extreme Warfare sounded like a caricature of bro-gamer culture. The former project was eventually taken by an embittered Landeros to a new company he founded just to publish it, whereupon it predictably flopped; the latter never got released at all. Trilobyte was officially wound up in January of 1999. “In the end, I never outran the shadow of The 7th Guest,” wrote Devine in a final email to his staff. “Mean old Stauf casts his long and bony shadow across this valley, and Trilobyte will always be remembered for those games and none other.”

In the aftermath, Devine continued his career in the games industry as an employee rather than an entrepreneur, working on popular blockbusters like Quake III, Doom 3, and Age of Empires III. (Good things, it seems, come to him in threes.) Landeros intermittently tried to get more of his quixotic interactive movies off the ground, whilst working as a graphic designer for the Web and other mediums. He’s become the keeper of the 7th Guest flame, for whatever that is still worth. In 2019, he launched a remastered 25th anniversary edition of the game, but it was greeted with lukewarm reviews and little enthusiasm from players. It seems that even nostalgia struggles to overcome the game’s manifest deficiencies.

The temptation to compare The 7th Guest to Myst, its more long-lived successor in the role of CD-ROM showcase for the masses, is all but irresistible. One might say that The 7th Guest really was all the things that Myst was so often accused of being: shallow, unfair, a tech demo masquerading as a game. Likewise, a comparison of the two games’ respective creators does Devine and Landeros no favors. The Miller brothers of Cyan Productions, the makers of Myst, took their fame and fortune with level-headed humility. Combined with their more serious attitude toward game design as a craft, this allowed them to weather the vicissitudes of fortune — albeit not without a few bumps along the way, to be sure! — and emerge with their signature franchise still intact. Devine and Landeros, alas, cannot make the same claim.

And yet I do want to be careful about using Myst as a cudgel with which to beat The 7th Guest. Unlike so many bad games, it wasn’t made for cynical reasons. On the contrary: all indications are that Devine and Landeros made it for all the right reasons, driven by a real, earnest passion to do something important, something groundbreaking. If the results largely serve today as an illustration of why static video clips strung together, whether they were created in a 3D modeler or filmed in front of live actors, are an unstable foundation on which to build a compelling game, the fact remains that we need examples of what doesn’t work as well as what does. And if the results look appallingly amateurish today on strictly aesthetic terms, they shouldn’t obscure the importance of The 7th Guest in the history of gaming. As gaming historians Magnus Anderson and Rebecca Levene put it, “The 7th Guest wasn’t anywhere near the league of professional film-making, but it moved games into the same sphere — a non-gamer could look at The 7th Guest and understand it, even if they were barely impressed.”

A year before Myst took the Wintel world by storm, The 7th Guest drove the first substantial wave of CD-ROM uptake, doing more than any other single product to turn 1993 into the long-awaited Year of CD-ROM. It’s been claimed that sales of CD-ROM drives jumped by 300 percent within weeks of its release. Indeed, The 7th Guest and CD-ROM in general became virtually synonymous for a time in the minds of consumers. And the game drove sales of SVGA cards to an equal degree; The 7th Guest was in fact the very first prominent game to demand more than everyday VGA graphics. Likewise, it undoubtedly prompted many a soul to take the plunge on a whole new 80486- or Pentium-based wundercomputer. And it also prompted the sale of countless CD-quality 16-bit sound cards. Thanks to The 7th Guest‘s immense success, game designers after 1993 had a far broader technological canvas on which to paint than they had before that year. And some of the things they painted there were beautiful and rich and immersive in all the ways that The 7th Guest tried to be, but couldn’t quite manage. While I heartily and unapologetically hate it as a game, I do love the new worlds of possibility it opened.

(Sources: the books La Saga des Jeux Vidéo by Daniel Ichbiah, Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene, and The 7th Guest: The Official Strategy Guide by Rusel DeMaria; Computer Gaming World of December 1990, May 1991, November 1992, October 1994, November 1994, June 1995, November 1998, December 1999, and July 2004; Electronic Entertainment of June 1994 and August 1995; Game Players PC Entertainment Vol. 5 No. 5; InterActivity of February 1996; Retro Gamer 85, 108, 122, and 123; Video Games and Computer Entertainment of August 1993; Zero of May 1992; Run 1986 Special Issue; Compute!’s Gazette of April 1985 and September 1986; ZX Computing of April 1986; Home Computing Weekly of July 19 1983; Popular Computing Weekly of May 26 1983; Crash of January 1985; Computer Gamer of December 1985 and February 1986; Origin Systems’s internal newsletter Point of Origin dated January 17 1992. Online sources include Geoff Keighley’s lengthy history of Trilobyte for GameSpot, John-Gabriel Adkins’s “Two Histories of Myst,” and “Jeremiah Nunan – An Irish Success Story” at the Jacksonville Review.

The 25th anniversary edition of The 7th Guest is available for purchase at GOG.com, as is the sequel The 11th Hour.)

 


Lemmings 2: The Tribes

When the lads at DMA Design started making the original Lemmings, they envisioned that it would allow you to bestow about twenty different “skills” upon your charges. But as they continued working on the game, they threw more and more of the skills out, both to make the programming task simpler and to make the final product more playable. They finally ended up with just eight skills, the perfect number to neatly line up as buttons along the bottom of the screen. In the process of this ruthless culling, Lemmings became a classic study in doing more with less in game design: those eight skills, combined in all sorts of unexpected ways, were enough to take the player through 120 ever-more-challenging levels in the first Lemmings, then 100 more in the admittedly less satisfying pseudo-sequel/expansion pack Oh No! More Lemmings.

Yet when the time came to make the first full-fledged sequel, DMA resurrected some of their discarded skills. And then they added many, many more of them: Lemmings 2: The Tribes wound up with no fewer than 52 skills in all. Not least for this reason, it’s often given short shrift by critics, who compare its baggy maximalism unfavorably with the first game’s elegant minimalism. To my mind, though, Lemmings 2 is almost a Platonic ideal of a sequel, building upon the genius of the original game in a way that’s truly challenging and gratifying to veterans. Granted, it isn’t the place you should start; by all means, begin with the classic original. When you’ve made it through those 120 levels, however, you’ll find 120 more here that are just as perplexing, frustrating, and delightful — and with even more variety to boot, courtesy of all those new skills.



The DMA Design that made Lemmings 2 was a changed entity in some ways. The company had grown in the wake of the first game’s enormous worldwide success, such that they had been forced to move out of their cozy digs above a baby store in the modest downtown of Dundee, Scotland, and into a more anonymous office in a business park on the outskirts of town. The core group that had created the first Lemmings — designer, programmer, and DMA founder David Jones; artists and level designers Mike Dailly and Gary Timmons; programmer and level designer Russell Kay — all remained on the job, but they were now joined by an additional troupe of talented newcomers.

Lemmings 2 also reflects changing times inside the games industry in ways that go beyond the size of its development team. Instead of 120 unrelated levels, there’s now a modicum of story holding things together. A lengthy introductory movie — which, in another telling sign of the times, fills more disk space than the game itself and required almost as many people to make — tells how the lemmings were separated into twelve tribes, all isolated from one another, at some point in the distant past. Now, the island (continent?) on which they live is facing an encroaching Darkness which will end all life there. Your task is to reunite the tribes, by guiding each of them through ten levels to reach the center of the island. Once all of the tribes have gathered there, they can reassemble a magical talisman, of which each tribe conveniently has one piece, and use it to summon a flying ark that will whisk them all to safety.

It’s not exactly an air-tight plot, but no matter; you’ll forget about it anyway as soon as the actual game begins. What’s really important are the other advantages of having twelve discrete progressions of ten levels instead of a single linear progression of 120. You can, you see, jump around among all these tribes at will. As David Jones said at the time of the game’s release, “We want to get away from ‘you complete a level or you don’t.'” When you get frustrated banging your head against a single stubborn level — and, this being a Lemmings game, you will get frustrated — you can just go work on another one for a while.

Rather than relying largely on the same set of graphics over the course of its levels, as the original does, each tribe in Lemmings 2 has its own audiovisual theme: there are beach-bum lemmings, Medieval lemmings, spooky lemmings, circus lemmings, alpine lemmings, astronaut lemmings, etc. In a tribute to the place where the game was born, there are even Scottish Highland lemmings (although Dundee is actually found in the less culturally distinctive — or culturally clichéd — Lowlands). And there’s even a “classic” tribe that reuses the original graphics; pulling it up feels a bit like coming home from an around-the-world tour.


Teaching Old Lemmings New Tricks

In this Beach level, a lemming uses the “kayak” skill to cross a body of water.

In this Medieval level, one lemming has become an “attractor”: a minstrel who entrances all the lemmings around him with his music, keeping them from marching onward. Meanwhile one of his colleagues is blazing a trail in front for the rest to eventually follow.

In this Shadow level, the lemming in front has become a “Fencer.” This allows him to dig out a path in front of himself at a slight upward angle. (Most of the skills in the game that at first seem bewilderingly esoteric actually do have fairly simple effects.)

In this Circus level, one lemming has become a “rock climber”: a sort of super-powered version of an ordinary climber, who can climb even a canted wall like this one.

In this Polar level, a lemming has become a “roper,” making a handy tightrope up and over the tree blocking the path.

In this Space level, we’ve made a “SuperLem” who flies in the direction of the mouse cursor.


Other pieces of plumbing help to make Lemmings 2 feel like a real, holistic game rather than a mere series of puzzles. The first game, as you may recall, gives you an arbitrary number of lemmings which begin each level and an arbitrary subset of them which must survive it; this latter number thus marks the difference between success and failure. In the sequel, though, each tribe starts its first level with 60 lemmings, who are carried over through all of the levels that follow. Any lemmings lost on one level, in other words, don’t come back in the succeeding ones. It’s possible to limp to the final finish line with just one solitary survivor remaining — and, indeed, you quite probably will do exactly this with a few of the tribes the first time through. But it’s also possible to finish all but a few of the levels without killing any lemmings at all. At the end of each level and then again at the end of each tribe’s collection of levels, you’re awarded a bronze, silver, or gold star based on your performance. To wind up with gold at the end, you usually need to have kept every single one of the little fellows alive through all ten levels. There’s a certain thematic advantage in this: people often note how the hyper-cute original Lemmings is really one of the most violent videogames ever, requiring you to kill thousands and thousands of the cuties over its course. This objection no longer applies to Lemmings 2. But more importantly, it sets up an obsessive-compulsive-perfectionist loop. First you’ll just want to get through the levels — but then all those bronze and silver performances lurking in your past will start to grate, and pretty soon you’ll be trying to figure out how to do each level just that little bit more efficiently. The ultimate Lemmings 2 achievement, needless to say, is to collect gold stars across the board.

This tiered approach to success and failure might be seen as evidence of a kinder design sensibility, but in most other respects just the opposite is true; Lemmings 2 has the definite feel of a game for the hardcore. The first Lemmings does a remarkably good job of teaching you how to play it interactively over the course of its first twenty levels or so, introducing you one by one to each of its skills along with its potential uses and limitations. There’s nothing remotely comparable in Lemmings 2; it just throws you in at the deep end. While there is a gradual progression in difficulty within each tribe’s levels, the game as a whole is a lumpier affair, especially in the beginning. Each level gives you access to between one and eight of the 52 available skills, whilst evincing no interest whatsoever in showing you how to use any of them. There is some degree of thematic grouping when it comes to the skills: the Highland lemmings like to toss cabers; the beach lemmings are fond of swimming, kayaking, and surfing; the alpine lemmings often need to ski or skate. Nevertheless, the sheer number of new skills you’re expected to learn on the fly is intimidating even for a veteran of the first game. The closest Lemmings 2 comes to its predecessor’s training levels are a few free-form sandbox environments where you can choose your own palette of skills and have at it. But even here, your education can be a challenging one, coming down as it still does to trial and error.

Your first hours with the game can be particularly intimidating; as soon as you’ve learned how one group of skills works well enough to finish one level, you’re confronted with a whole new palette of them on the next level. Even I, a huge fan of the first game, bounced off the second one quite a few times before I buckled down, started figuring out the skills, and, some time thereafter, started having fun.

Luckily, once you have put in the time to learn how the skills work, Lemmings 2 becomes very fun indeed — every bit as rewarding as the first game, possibly even more so. Certainly its level design is every bit as good — better in fact, relying more on logic and less on dodgy edge cases in the game engine than do the infamously difficult final levels of the first Lemmings. Even the spiky difficulty curve isn’t all bad; it can be oddly soothing to start on a new tribe’s relatively straightforward early levels after being taxed to the utmost on another tribe’s last level. If the first Lemmings is mountain climbing as people imagine it to be — a single relentless, ever-steeper ascent to a dizzying peak — the second Lemmings has more in common with the reality of the sport: a set of more or less difficult stages separated by more or less comfortable base camps. While it’s at least as daunting in the end, it does offer more ebbs and flows along the way.

One might say, then, that Lemmings 2 is designed around a rather literal interpretation of the concept of a sequel. That is to say, it assumes that you’ve played its predecessor before you get to it, and are now ready for its added complexity. That’s bracing for anyone who fulfills that criterion. But in 1993, the year of Lemmings 2‘s release, its design philosophy had more negative than positive consequences for its own commercial arc and for that of the franchise to which it belonged.

The fact is that Lemmings 2‘s attitude toward its sequel status was out of joint with the way sequels had generally come to function by 1993. In a fast-changing industry that was rapidly attracting new players, the ideal sequel, at least in the eyes of most industry executives, was a game equally welcoming to both neophytes and veterans. Audiovisual standards were changing so rapidly that a game that was just a couple of years old could already look painfully dated. What new player with a shiny new computer wanted to play some ugly old thing just to earn a right to play the latest and greatest?

That said, Lemmings 2 actually didn’t look all that much better than its predecessor either, flashy opening movie aside. Part of this was down to DMA Design still using the 1985-vintage Commodore Amiga, which was still very popular as a gaming computer in Britain and other European countries, as their primary development platform, then porting the game to MS-DOS and various other more modern platforms. Staying loyal to the Amiga meant working within some fairly harsh restrictions, such as that of having no more than 32 colors on the screen at once, not to mention making the whole game compact enough to run entirely off floppy disk; hard drives, much less CD-ROM drives, were still not common among European Amiga owners. Shortly before the release of Lemmings 2, David Jones confessed to being “a little worried” about whether people would be willing to look beyond the unimpressive graphics and appreciate the innovations of the game itself. As it happened, he was right to be worried.

Lemmings and Oh No! More Lemmings sold in the millions across a bewildering range of platforms, from modern mainstream computers like the Apple Macintosh and Wintel machines to antique 8-bit computers like the Commodore 64 and Sinclair Spectrum, from handheld systems like the Nintendo Game Boy and Atari Lynx to living-room game consoles like the Sega Master System and the Nintendo Entertainment System. Lemmings 2, being a much more complex game under the hood as well as on the surface, wasn’t quite so amenable to being ported to just about any gadget with a CPU, even as its more off-putting initial character and its lack of new audiovisual flash did it no favors either. It was still widely ported and still became a solid success by any reasonable standard, mind you, but likely sold in the hundreds of thousands rather than the millions. All indications are that the first game and its semi-expansion pack continued to sell more copies than the second even after the latter’s release.

In the aftermath of this muted reception, the bloom slowly fell off the Lemmings rose, not only for the general public but also for DMA Design themselves. The franchise’s true jump-the-shark moment ironically came as part of an attempt to re-jigger the creatures to become media superstars beyond the realm of games. The Children’s Television Workshop, the creator of Sesame Street among other properties, was interested in moving the franchise onto television screens. In the course of these negotiations, they asked DMA to give the lemmings more differentiated personalities in the next game, to turn them from anonymous marchers, each just a few pixels across, into something more akin to individualized cartoon characters. Soon the next game was being envisioned as the first of a linked series of no fewer than four of them, each one detailing the further adventures of three of the tribes after their escape from the island at the end of Lemmings 2, each one ripe for trans-media adaptation by the Children’s Television Workshop. But the first game of this new generation, called The Lemmings Chronicles, just didn’t work. The attempt to cartoonify the franchise was cloying and clumsy, and the gameplay fell to pieces; unlike Lemmings 2, Lemmings Chronicles eminently deserves its underwhelming critical reputation. DMA insiders like Mike Dailly have since admitted that it was developed more out of obligation than enthusiasm: “We were all ready to move on.” When it performed even worse than its predecessor, the Children’s Television Workshop dropped out; all of its compromises had been for nothing.

Released just a year after Lemmings 2, Lemmings Chronicles marked the last game in the six-game contract that DMA Design had signed with their publisher Psygnosis what seemed like an eternity ago — in late 1987 to be more specific, when David Jones had first come to Psygnosis with his rather generic outer-space shoot-em-up Menace, giving no sign that he was capable of something as ingenious as Lemmings. Now, having well and truly demonstrated their ingenuity, DMA had little interest in re-upping; they were even willing to leave behind all of their intellectual property, which the contract Jones had signed gave to Psygnosis in perpetuity. In fact, they were more than ready to leave behind the cute-and-cuddly cartoon aesthetic of Lemmings and return to more laddish forms of gaming. The eventual result of that desire would be a second, more long-lasting worldwide phenomenon, known as Grand Theft Auto.

Meanwhile Sony, who had acquired Psygnosis in 1993, continued off and on to test the waters with new iterations of the franchise, but all of those attempts evinced the same vague sense of ennui that had doomed Lemmings Chronicles; none became hits. The last Lemmings game that wasn’t a remake appeared in 2000.

It’s interesting to ask whether DMA Design and Psygnosis could have managed the franchise better, thereby turning it into a permanent rather than a momentary icon of gaming, perhaps even one on a par with the likes of Super Mario and Sonic the Hedgehog; they certainly had the sales to compete head-to-head with those other videogame icons for a few years there in the early 1990s. The obvious objection is that Mario and Sonic were individualized characters, while DMA’s lemmings were little more than a handful of tropes moving in literal lockstep. Still, more has been done with less in the annals of media history. If everyone had approached Lemmings Chronicles with more enthusiasm and a modicum more writing and branding talent, maybe the story would have turned out differently.

Many speculate today that the franchise must inevitably see another revival at some point, what with 21st-century pop culture’s tendency to mine not just the A-list properties of the past, but increasingly its B- and C-listers as well, in the name of one generation’s nostalgia and another’s insatiable appetite for kitsch. Something tells me as well that we haven’t seen the last of Lemmings, but, as of this writing anyway, the revival still hasn’t arrived.

As matters currently stand, then, the brief-lived but frenzied craze for Lemmings has gone down in history, alongside contemporaries like Tetris and The Incredible Machine, as one more precursor of the casual revolution in gaming that was still to come, with its very different demographics and aesthetics. But in addition to that, it gave us two games that are brilliant in their own right, that remain as vexing but oh-so-rewarding as they were in their heyday. Long may they march on.

One other surviving tribute to Dundee’s second most successful gaming franchise is this little monument at the entrance to the city’s Seabraes Park, erected by local artist Alyson Conway in 2013. Lemmings and Grand Theft Auto… not bad for a city of only 150,000 souls.

(Sources: the book Grand Thieves and Tomb Raiders by Magnus Anderson and Rebecca Levene; Compute! of January 1992; Amiga Format of May 1993 and the special 1992 annual; Retro Gamer 39; The One of November 1993; Computer Gaming World of July 1993.

Lemmings 2 has never gotten a digital re-release. I therefore make it available for download here, packaged to be as easy as possible to get running under DOSBox on your modern computer.)

 
 


The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while pushing its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would now offer 16-bit resolution — i.e., the same quality as an audio CD.
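The distinction between planar and chunky graphics is worth a moment, since it explains why the AAA plans mattered so much for 3D. In the Amiga’s planar modes, each pixel’s color index is scattered one bit at a time across several separate bitplanes, so reading or writing a single pixel — the basic operation of any texture-mapped 3D renderer — means touching every plane; in a chunky mode, the whole index lives in one byte. The sketch below is purely illustrative (the function name and data layout are my own, not Amiga code): it reassembles one scanline of bitplanes into chunky pixels.

```python
def planar_to_chunky(planes, width):
    """Convert one scanline of planar bitplanes to chunky pixels.

    planes: one bytes-like object per bitplane; bit 7 of byte 0 is the
    leftmost pixel, and plane 0 holds the low bit of each color index.
    (Illustrative layout only, not actual Amiga hardware code.)
    """
    pixels = bytearray(width)
    for x in range(width):
        byte_index, bit = divmod(x, 8)
        mask = 0x80 >> bit          # leftmost pixel is the high bit
        value = 0
        for p, plane in enumerate(planes):
            if plane[byte_index] & mask:
                value |= 1 << p     # gather one bit from each plane
        pixels[x] = value
    return pixels

# Three bitplanes give eight colors; one byte per plane covers eight pixels.
planes = [bytes([0b10000000]),      # plane 0: low bit of the index
          bytes([0b11000000]),      # plane 1
          bytes([0b00100000])]      # plane 2: high bit of the index
print(list(planar_to_chunky(planes, 8)))  # → [3, 2, 4, 0, 0, 0, 0, 0]
```

Note how many memory accesses the inner loop needs just to recover one pixel; in a chunky mode that same pixel is a single byte read or write, which is why chunky modes were considered essential for fast 3D rendering.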

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs —  but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly behind schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; what games were still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D.) But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly in vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beat the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh those of Commodore.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals as well asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992 Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which were stamped with characters only in some continental European markets that needed them. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped fewer than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their resumes, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like this one, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. Ironically, this proved more profitable than actually selling computers or game consoles. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks, who saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its last issue until the rather astonishingly late date of November of 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s ridiculously exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses as a shareholder by raising his income as an executive.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and stopped throwing good money after bad as soon as the threshold of the collateral it could offer up in exchange was exceeded. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died of the kidney disease he had been battling for most of his life on June 20, 1994, at age 62. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes it Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)

 


Myst (or, The Drawbacks to Success)

Robyn Miller, one half of the pair of brothers who created the adventure game known as Myst with their small studio Cyan, tells a story about its development that’s irresistible to a writer like me. When the game was nearly finished, he says, its publisher Brøderbund insisted that it be put through “focus-group testing” at their offices. Robyn and his brother Rand reluctantly agreed, and soon the first group of guinea pigs shuffled into Brøderbund’s conference room. Much to its creators’ dismay, they hated the game. But then, just as the Miller brothers were wondering whether they had wasted the past two years of their lives making it, the second group came in. Their reaction was the exact opposite: they loved the game.

So would it be forevermore. Myst would prove to be one of the most polarizing games in history, loved and hated in equal measure. Even today, everyone seems to have a strong opinion about it, whether they’ve actually played it or not.

Myst‘s admirers are numerous enough to have made it the best-selling single adventure game in history, as well as the best-selling 1990s computer game of any type in terms of physical units shifted at retail: over 6 million boxed copies sold between its release in 1993 and the dawn of the new millennium. In the years immediately after its release, it was trumpeted at every level of the mainstream press as the herald of a new, dawning age of maturity and aesthetic sophistication in games. Then, by the end of the decade, it was lamented as a symbol of what games might have become, if only the culture of gaming had chosen it rather than the near-simultaneously-released Doom as its model for the future. Whatever the merits of that argument, the hardcore Myst lovers remained numerous enough in later years to support five sequels, a series of novels, a tabletop role-playing game, and multiple remakes and remasters of the work which began it all. Their passion was such that, when Cyan gave up on an attempt to turn Myst into a massively-multiplayer game, the fans stepped in to set up their own servers and keep it alive themselves.

And yet, for all the love it’s inspired, the game’s detractors are if anything even more committed than its proponents. For a huge swath of gamers, Myst has become the poster child for a certain species of boring, minimally interactive snooze-fest created by people who have no business making games — and, runs the spoken or unspoken corollary, played by people who have no business playing them. Much of this vitriol comes from the crowd who hate any game that isn’t violent and visceral on principle.

But the more interesting and perhaps telling brand of hatred comes from self-acknowledged fans of the adventure-game genre. These folks were usually raised on the Sierra and LucasArts traditions of third-person adventures — games that were filled with other characters to interact with, objects to pick up and carry around and use to solve puzzles, and complicated plot arcs unfolding chapter by chapter. They have a decided aversion to the first-person, minimalist, deserted, austere Myst, sometimes going so far as to say that it isn’t really an adventure game at all. But, however they categorize it, they’re happy to credit it with all but killing the adventure genre dead by the end of the 1990s. Myst, so this narrative goes, prompted dozens of studios to abandon storytelling and characters in favor of yet more sterile, hermetically sealed worlds just like its own. And when the people understandably rejected this airless vision, that was that for the adventure game writ large. Some of the hatred directed toward Myst by stalwart adventure fans — not only fans of third-person graphic adventures, but, going even further back, fans of text adventures — reaches an almost poetic fever pitch. A personal favorite of mine is the description deployed by Michael Bywater, who in previous lives was himself an author of textual interactive fiction. Myst, he says, is just “a post-hippie HyperCard stack with a rather good music loop.”

After listening to the cultural dialog — or shouting match! — which has so long surrounded Myst, one’s first encounter with the actual artifact that spurred it all can be more than a little anticlimactic. Seen strictly as a computer game, Myst is… okay. Maybe even pretty good. It strikes this critic at least as far from the best or worst game of its year, much less of its decade, still less of all gaming history. Its imagery is well-composited and occasionally striking, its sound and music design equally apt. The sense of desolate, immersive beauty it all conveys can be strangely affecting, and it’s married to puzzle-design instincts that are reasonable and fair. Myst‘s reputation in some quarters as impossible, illogical, or essentially unplayable is unearned; apart from some pixel hunts and perhaps the one extended maze, there’s little to really complain about on that front. On the contrary: there’s a definite logic to its mechanical puzzles, and figuring out how its machinery works through trial and error and careful note-taking, then putting your deductions into practice, is genuinely rewarding, assuming you enjoy that sort of thing.

At the same time, though, there’s just not a whole lot of there there. Certainly there’s no deeper meaning to be found; Myst never tries to be about more than exploring a striking environment and solving intricate puzzles. “When we started, we wanted to make a [thematic] statement, but the project was so big and took so much effort that we didn’t have the energy or time to put much into that part of it,” admits Robyn Miller. “So, we decided to just make a neat world, a neat adventure, and say important things another time.” And indeed, a “neat world” and “neat adventure” are fine ways of describing Myst.

Depending on your preconceptions going in, actually playing Myst for the first time is like going to meet your savior or the antichrist, only to find a pleasant middle-aged fellow who offers to pour you a cup of tea. It’s at this point that the questions begin. Why does such an inoffensive game offend so many people? Why did such a quietly non-controversial game become such a magnet for controversy? And the biggest question of all: why did such a simple little game, made by five people using only off-the-shelf consumer software, become one of the most (in)famous money spinners in the history of the computer-games industry?

We may not be able to answer all of these whys to our complete satisfaction; much of the story of Myst surely comes down to sheer happenstance, to the proverbial butterfly flapping its wings somewhere on the other side of the world. But we can at least do a reasonably good job with the whats and hows of Myst. So, let’s consider now what brought Myst about and how it became the unlikely success it did. After that, we can return once again to its proponents and its detractors, and try to split the difference between Myst as gaming’s savior and Myst as gaming’s antichrist.


Rand Miller

Robyn Miller

If nothing else, the origin story of Myst is enough to make one believe in karma. As I wrote in an earlier article, the Miller brothers and their company Cyan came out of the creative explosion which followed Apple’s 1987 release of HyperCard, a unique Macintosh authoring system which let countless people just like them experiment for the first time with interactive multimedia and hypertext. Cyan’s first finished project was The Manhole. Published in November of 1988 by Mediagenic, it was a goal-less software toy aimed at children, a virtual fairy-tale world to explore. Six months later, Mediagenic added music and sound effects and released it on CD-ROM, making it the first entertainment product ever to appear on that medium. The next couple of years brought two more interactive explorations for children from Cyan, published on floppy disk and CD-ROM.

Even as these were being published, however, the wheels were gradually coming off of Mediagenic, thanks to a massive patent-infringement lawsuit they lost to the Dutch electronics giant Philips and a whole string of other poor decisions and unfortunate events. In February of 1991, a young bright spark named Bobby Kotick seized Mediagenic in a hostile takeover, reverting the company to its older name of Activision. By this point, the Miller brothers were getting tired of making whimsical children’s toys; they were itching to make a real game, with a goal and puzzles. But when they asked Activision’s new management for permission to do so, they were ordered to “keep doing what you’ve been doing.” Shortly thereafter, Kotick announced that he was taking Activision into Chapter 11 bankruptcy. After he did so, Activision simply stopped paying Cyan the royalties on which they depended. The Miller brothers were lost at sea, with no income stream and no relationships with any other publishers.

But at the last minute, they were thrown an unexpected lifeline. Lo and behold, the Japanese publisher Sunsoft came along offering to pay Cyan $265,000 to make a CD-ROM-based adult adventure game in the same general style as their children’s creations — i.e., exactly what the Miller brothers had recently asked Activision for permission to do. Sunsoft was convinced that there would be major potential for such a game on the upcoming generation of CD-ROM-based videogame consoles and multimedia set-top boxes for the living room — so convinced, in fact, that they were willing to fund the development of the game on the Macintosh and take on the job of porting it to these non-computer platforms themselves, all whilst signing over the rights to the computer version(s) to Cyan for free. The Miller brothers, reduced by this point to a diet of “rice and beans and government cheese,” as Robyn puts it, knew deliverance when they saw it. They couldn’t sign the contract fast enough. Meanwhile Activision had just lost out on the chance to release what would turn out to be one of the games of the decade.

But of course the folks at Cyan were as blissfully unaware of that future as those at Activision. They simply breathed sighs of relief and started making their game. In time, Cyan signed a contract with Brøderbund to release the computer versions of their game, starting with the Macintosh original.

Myst certainly didn’t begin as any conscious attempt to re-imagine the adventure-game form. Those who later insisted on seeing it in almost ideological terms, as a sort of artistic manifesto, were often shocked when they first met the Miller brothers in person. This pair of plain-spoken, baseball-cap-wearing country boys were anything but ideologues, much less stereotypical artistes. Instead they seemed a perfect match for the environs in which they worked: an unassuming two-story garage in Spokane, Washington, far from any centers of culture or technology. Their game’s unique personality actually stemmed from two random happenstances rather than any messianic fervor.

One of these was — to put it bluntly — their sheer ignorance. Working on the minority platform that was the Macintosh, specializing up to this point in idiosyncratic children’s software, the Miller brothers were oddly disengaged from the computer-games industry whose story I’ve been telling in so many other articles here. By their own account, they had literally never even seen any of the contemporary adventure games from companies like LucasArts and Sierra before making Myst. In fact, Robyn Miller says today that he had only played one computer game in his life to that point: Infocom’s ten-year-old Zork II. Rand Miller, being the older brother, the first mover behind their endeavors, and the more technically adept of the pair, was perhaps a bit more plugged-in, but only a bit.

The other circumstance which shaped Myst was the technology employed to create it. This statement is true of any game, but it becomes even more salient here because the technology in question was so different from that employed by other adventure creators. Myst is indeed simply a HyperCard stack — the “post-hippie” is in the eye of the beholder — gluing together pictures generated by the 3D modeler StrataVision. During the second half of its development, a third everyday Macintosh software package made its mark: Apple’s QuickTime video system, which allowed Myst‘s creators to insert snippets of themselves playing the roles of the people who previously visited the semi-ruined worlds you spend the game exploring. All of these tools are presentation-level tools, not conventional game-building ones. Seen in this light, it’s little surprise that so much of Myst is surface. At bottom, it’s a giant hypertext done in pictures, with very little in the way of systems of any sort behind it, much less any pretense of world simulation. You wander through its nodes, in some of which you can click on something, which causes some arbitrary event to happen. The one place where the production does interest itself in a state which exists behind its visuals is in the handful of mechanical devices found scattered over each of its landscapes, whose repair and/or manipulation form the basis of the puzzles that turn Myst into a game rather than an unusually immersive slideshow.
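That “giant hypertext done in pictures” is simple enough to sketch in code. The toy model below is purely illustrative — every name in it is invented, and Cyan’s actual implementation was a HyperCard stack, not Python — but it captures the architecture just described: static pre-rendered views linked by a lookup table, with a handful of hotspots mutating a small pool of world state.

```python
# Toy model of a Myst-style "hypertext of pictures": discrete nodes,
# each just a static image with clickable regions. Invented names
# throughout -- this is a sketch of the architecture, not Cyan's code.

class Node:
    def __init__(self, image, hotspots=None):
        self.image = image              # filename of the pre-rendered view
        self.hotspots = hotspots or {}  # region name -> action callable

# The only "state behind the visuals" lives in a small dictionary.
world_state = {"marker_switch_on": False}

def flip_marker_switch():
    world_state["marker_switch_on"] = not world_state["marker_switch_on"]

nodes = {
    "dock":    Node("dock.png"),
    "library": Node("library.png", {"switch": flip_marker_switch}),
}

# Navigation is just a lookup table between nodes: clicking an exit
# region swaps in a different picture.
exits = {("dock", "forward"): "library", ("library", "back"): "dock"}

def click(current, region):
    """Return the next node name, firing any hotspot action first."""
    node = nodes[current]
    if region in node.hotspots:
        node.hotspots[region]()  # an arbitrary event, then stay put
        return current
    return exits.get((current, region), current)
```

There is no parser, no inventory, no simulation here — which is exactly why presentation-level tools like HyperCard were sufficient to build it.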

In making Myst, each brother fell into the role he was used to from Cyan’s children’s projects. The brothers together came up with the story and world design, then Robyn went off to do the art and music while Rand did the technical plumbing in HyperCard. One Chuck Carter helped Robyn on the art side and Rich Watson helped Rand on the programming side, while Chris Brandkamp produced the intriguing, evocative environmental soundscape by all sorts of improvised means: banging a wrench against the wall or blowing bubbles in a toilet bowl, then manipulating the samples to yield something appropriately other-worldly. And that was the entire team. It was a shoestring operation, amateurish in the best sense. The only thing that distinguished the boys at Cyan from a hundred thousand other hobbyists playing with the latest creative tools on their own Macs was the fact that Cyan had a contract to do so — and a commensurate quantity of real, raw talent, of course.

Ironically given that Myst was treated as such a cutting-edge product at the time of its release, in terms of design it’s something of a throwback — a fact that does become less surprising when one considers that its creators’ experience with adventure games stopped in the early 1980s. A raging debate had once taken place in adventure circles over whether the ideal protagonist should be a blank slate, imprintable by the player herself, or a fully-fleshed-out role for the player to inhabit. The verdict had largely come down on the side of the latter as games’ plots had grown more ambitious, but the whole discussion had passed the Miller brothers by.

So, with Myst we were back to the old “nameless, faceless adventurer” paradigm which Sierra and LucasArts had long since abandoned. Myst actively encourages you to imagine that it really is you there in its world. The story begins when you open a mysterious book here on our world, whereupon you get sucked into an alternate dimension and find yourself standing on the dock of a deserted island. You soon learn that you’re following a trail first blazed by a father and his two sons, all of whom had the ability to hop about between dimensions — or “ages,” as the game calls them — and alter them to their will. Unfortunately, the father is now said to be dead, while the two brothers have each been trapped in a separate interdimensional limbo, each blaming the other for their father’s death. (These themes of sibling rivalry have caused much comment over the years, especially in light of the fact that each brother in the game is played by one of the real Miller brothers. But said real brothers have always insisted that there are no deeper meanings to be gleaned here…)

You can access four more worlds from the central island just as soon as you solve the requisite puzzles. In each of them, you must find a page of a magical book. Putting the pages together, along with a fifth page found on the central island, allows you to free the brother of your choice, or to do… something else, which actually leads to the best ending. This last-minute branch to an otherwise unmalleable story is a technique we see in a fair number of other adventure games wishing to make a claim to the status of genuinely interactive fictions. (In practice, of course, players of those games and Myst alike simply save before the final choice and check out all of the endings.)

For all its emphasis on visuals, Myst is designed much like a vintage text adventure in many ways. Even setting aside its explicit maze, its network of discrete, mostly empty locations resembles the map from an old-school text adventure, where navigation is half the challenge. Similarly, its complex environmental puzzles, where something done in one location may have an effect on the other side of the map, smack of one of Infocom’s more cerebral, austere games, such as Zork III or Spellbreaker.

This is not to say that Myst is a conscious throwback; the nature of the puzzles, like so much else about the game, is as much determined by the Miller brothers’ ignorance of contemporary trends in adventure design as by the technical constraints under which they labored. Among the latter was the impossibility of even letting the player pick things up and carry them around to use elsewhere. Utterly unfazed, Rand Miller coined an aphorism: “Turn your problems into features.” Thus Myst‘s many vaguely steam-punky mechanical puzzles, all switches to throw and ponderous wheels to set in motion, are dictated as much by its designers’ inability to implement a player inventory as by their acknowledged love for Jules Verne.
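The shape of those inventory-free puzzles is easy to model: all state lives in the machinery itself, and “solving” means discovering the right configuration. The sketch below is hypothetical — the boiler, valve, and wheel are invented for illustration and come from nothing in Myst‘s actual code — but it shows how puzzles of this kind need no player inventory at all.

```python
# Hedged sketch of a Myst-style mechanical puzzle: every bit of state
# belongs to the world's fixed machinery, so there is nothing to pick
# up or carry. All names and values here are invented.

class BoilerPuzzle:
    def __init__(self):
        self.valve_open = False
        self.wheel_turns = 0

    def turn_wheel(self):
        # The wheel cycles through four positions.
        self.wheel_turns = (self.wheel_turns + 1) % 4

    def pull_valve_lever(self):
        self.valve_open = not self.valve_open

    @property
    def solved(self):
        # The "solution" is simply a particular configuration of the
        # machinery, discoverable by trial, error, and note-taking.
        return self.valve_open and self.wheel_turns == 3

puzzle = BoilerPuzzle()
puzzle.pull_valve_lever()
for _ in range(3):
    puzzle.turn_wheel()
```

Every verb is a manipulation of fixed machinery — switches thrown, wheels turned — which is precisely what a HyperCard stack of clickable pictures could comfortably present.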

And yet, whatever the technological determinism that spawned it, this style of puzzle design truly was a breath of fresh air for gamers who had grown tired of the “use this object on that hotspot” puzzles of Sierra and LucasArts. To their eternal credit, the Miller brothers took this aspect of the design very seriously, giving their puzzles far more thought than Sierra at least tended to do. They went into Myst with no experience designing puzzles, and their insecurity about this aspect of their craft was perhaps their ironic saving grace. Before they even had a computer game to show people, they spent hours walking outsiders through their scenario Dungeons & Dragons-style, telling them what they saw and listening to how they tried to progress. And once they did have a working world on the computer, they spent more hours sitting behind players, watching what they did. Robyn Miller, asked in an interview shortly after the game’s release whether there was anything he “hated,” summed up their commitment to consistent, logical puzzle design and world-building thusly (in Myst, the two are largely one and the same):

Seriously, we hate stuff without integrity. Supposed “art” that lacks attention to detail. That bothers me a lot. Done by people who are forced into doing it or who are doing it for formula reasons and monetary reasons. It’s great to see something that has integrity. It makes you feel good. The opposite of that is something I dislike.

We tried to create something — a fantastic world — in a very realistic way. Creating a fantasy world in an unrealistic way is the worst type of fantasy. In Jurassic Park, the idea of dinosaurs coming to life in the twentieth century is great. But it works in that movie because they also made it believable. That’s how the idea and the execution of that idea mix to create a truly great experience.

Taken as a whole, Myst is a master class in designing around constraints. Plenty of games have been ruined by designers whose reach exceeded their core technology’s grasp. We can see this phenomenon as far back as the time of Scott Adams: his earliest text adventures were compact marvels, but quickly spiraled into insoluble incoherence when he started pushing beyond what his simplistic parsers and world models could realistically present. Myst, then, is an artwork of the possible. Managing inventory, with the need for a separate inventory screen and all the complexities of coding this portable object interacting with that other thing in the world, would have stretched HyperCard past the breaking point. So, it’s gone. Interactive conversations would have been similarly prohibitive with the technology at the Millers’ fingertips. So, they devised a clever dodge, showing the few characters that exist only as recordings, or through one-way screens where you can see them, but they can’t see (or hear) you; that way, a single QuickTime video clip is enough to do the trick. In paring things back so dramatically, the Millers wound up with an adventure game unlike any that had been seen before. Their problems really did become their game’s features.

For the most part, anyway. The networks of nodes and pre-rendered static views that constitute the worlds of Myst can be needlessly frustrating to navigate, thanks to the way that the views prioritize aesthetics over consistency; rotating your view in place sometimes turns you 90 degrees, sometimes 180 degrees, sometimes somewhere in between, according to what the designers believed would provide the most striking image. Orienting yourself and moving about the landscape can thus be a confusing process. One might complain as well that it’s a slow one, what with all the empty nodes which you must move through to get pretty much anywhere — often just to see if something you’ve done on one side of the map has had any effect on something on its other side. Again, a comparison with the twisty little passages of an old-school text adventure, filled with mostly empty rooms, does strike me as thoroughly apt.

On the other hand, a certain glaciality of pacing seems part and parcel of what Myst fundamentally is. This is not a game for the impatient. It’s rather targeted at two broad types of player: the aesthete, who will be content just to wander the landscape taking in the views, perhaps turning to a walkthrough to be able to see all of the worlds; and the dedicated puzzle solver, willing to pull out paper and pencil and really dig into the task of understanding how all this strange machinery hangs together. Both groups have expressed their love for Myst over the years, albeit in terms which could almost convince you they’re talking about two entirely separate games.



So much for Myst the artifact. What of Myst the cultural phenomenon?

The origins of the latter can be traced to the Miller brothers’ wise decision to take their game to Brøderbund. Brøderbund tended to publish fewer products per year than their peers at Electronic Arts, Sierra, or the lost and unlamented Mediagenic, but they were masterful curators, with a talent for spotting software which ordinary Americans might want to buy and then packaging and marketing it perfectly to reach them. (Their insistence on focus testing, so confusing to the Millers, is proof of their competence; it’s hard to imagine any other publisher of the time even thinking of such a thing.) Brøderbund published a string of products over the course of a decade or more which became more than just hits; they became cultural icons of their time, getting significant attention in the mainstream press in addition to the computer magazines: The Print Shop, Carmen Sandiego, Lode Runner, Prince of Persia, SimCity. And now Myst was about to become the capstone to a rather extraordinary decade, their most successful and iconic release of all.

Brøderbund first published the game on the Macintosh in September of 1993, where it was greeted with rave reviews. Not a lot of games originated on the Mac at all, so a new and compelling one was always a big event. Mac users tended to conceive of themselves as the sophisticates of the computer world, wearing their minority status as a badge of pride. Myst hit the mark beautifully here; it was the Mac-iest of Mac games. MacWorld magazine’s review is a rather hilarious example of a homer call. “It’s been polished until it shines,” wrote the magazine. Then, in the next paragraph: “We did encounter a couple of glitches and frozen screens.” Oh, well.

Helped along by press like this, Myst came out of the gates strong. By one report, it sold 200,000 copies on the Macintosh alone in its first six months. If correct or even close to correct, those numbers are extraordinary; they’re the numbers of a hit even on the gaming Mecca that was the Wintel world, much less on the Mac, with its vastly smaller user base.

Still, Brøderbund knew that Myst‘s real opportunity lay with those selfsame plebeian Wintel machines which most Mac users, the Miller brothers included, disdained. Just as soon as Cyan delivered the Mac version, Brøderbund set up an internal team — larger than the Cyan team which had made the game in the first place — to do the port as quickly as possible. Importantly, Myst was ported not to bare MS-DOS, where almost all “hardcore” games still resided, but to Windows, where the new demographics which Brøderbund hoped to attract spent all of their time. Luckily, the game’s slideshow visuals were possible even under Windows’s sluggish graphics libraries, and Apple had recently ported their QuickTime video system to Microsoft’s platform. The Windows version of Myst shipped in March of 1994.

And now Brøderbund’s marketing got going in earnest, pushing the game as the one showcase product which every purchaser of a new multimedia PC simply had to have. At the time, most CD-ROM based games also shipped in a less impressive floppy-disk-based version, with the latter often still outselling the former. But Brøderbund and Cyan made the brave choice not to attempt a floppy-disk version at all. The gamble paid off beautifully, furthering the carefully cultivated aspirational quality which already clung to Myst, now billed as the game which simply couldn’t be done on floppy disk. Brøderbund’s lush advertisements had a refined, adult air about them which made them stand out from the dragons, spaceships, and scantily-clad babes that constituted the usual motifs of game advertising. As the crowning touch, Brøderbund devised a slick tagline: Myst was “the surrealistic adventure that will become your world.” The Miller brothers scoffed at this piece of marketing-speak — until they saw how Myst was flying off the shelves in the wake of it.

So, through a combination of lucky timing and precision marketing, Myst blew up huge. I say this not to diminish its merits as a puzzle-solving adventure game, which are substantial, but simply because I don’t believe those merits were terribly relevant to the vast majority of people who purchased it. A parallel can be drawn with Infocom’s game of Zork, which similarly surfed a techno-cultural wave a decade before Myst. It was on the scene just as home computers were first being promoted in the American media as the logical, more permanent successors to the videogame-console fad. For a time, Zork, with its ability to parse pseudo-natural-English sentences, was seen by computer salespeople as the best overall demonstration of what a computer could do; they therefore showed it to their customers as a matter of course. And so, when countless new computer systems went home with their new owners, there was also a copy of Zork in the bag. The result was Infocom’s best-selling game of all time, to the tune of almost 400,000 copies sold.

Myst now played the same role in a new home-computer boom. The difference was that, while the first boom had fizzled rather quickly when people realized of what limited practical utility those early machines actually were, this second boom would be a far more sustained affair. In fact, it would become the most sustained boom in the history of the consumer PC, stretching from approximately 1993 right through the balance of the decade, with every year breaking the sales records set by the previous one. The implications for Myst, which arrived just as the boom was beginning, were titanic. Even long after it ceased to be particularly cutting-edge, it continued to be regarded as an essential accessory for every PC, to be tossed into the bags carried home from computer stores by people who would never buy another game.

Myst had already established its status by the time the hype over the World Wide Web and Windows 95 really lit a fire under computer sales in 1995. It passed the 1 million copy mark in the spring of that year. By the same point, a quickie “strategy guide” published by Prima, ideal for the many players who just wanted to take in its sights without worrying about its puzzles, had passed an extraordinary 300,000 copies sold — thus making its co-authors, who’d spent all of three weeks working on it, the two luckiest walkthrough authors in history. Defying all of the games industry’s usual logic, which dictated that titles sold in big numbers for only a few months before fizzling out, Myst‘s sales just kept accelerating from there. It sold 850,000 copies in 1996 in the United States alone, then another 870,000 copies in 1997. Only in 1998 did it finally begin to flag, posting domestic sales of just 540,000 copies. Fortunately, the European market for multimedia PCs, which lagged a few years behind the American one, was now also burning bright, opening up whole new frontiers for Myst. Its total retail sales topped 6 million by 2000, at least 2 million of them outside of North America. Still more copies — it’s impossible to say how many — had shipped as pack-in bonuses with multimedia upgrade kits and the like. Meanwhile, under the terms of Sunsoft’s original agreement with Cyan, it was also ported by the former to the Sega Saturn, Atari Jaguar, 3DO, and CD-I living-room consoles. Myst was so successful that another publisher came out with an elaborate parody of it as a full-fledged computer game in its own right, under the indelible title of Pyst. Considering that it featured the popular sitcom star John Goodman, Pyst must have cost far more to make than the shoestring production it mocked.

As we look at the staggering scale of Myst's success, we can't avoid returning to that vexing question of why it all should have come to be. Yes, Brøderbund's marketing campaign was brilliant, but there must be more to it than that. Certainly we're far from the first to wonder about it all. As early as December of 1994, Newsweek magazine noted that “in the gimmick-dominated world of computer games, Myst should be the equivalent of an art film, destined to gather critical acclaim and then dust on the shelves.” So why was it selling better than guaranteed crowd-pleasers with names like Star Wars on their boxes?

It’s not that it’s that difficult to pinpoint some of the other reasons why Myst should have been reasonably successful. It was a good-looking game that took full advantage of CD-ROM, at a time when many computer users — non-gamers almost as much as gamers — were eager for such things to demonstrate the power of their new multimedia wundermachines. And its distribution medium undoubtedly helped its sales in another way: in this time before CD burners became commonplace, it was immune to the piracy that many publishers claimed was costing them at least half their sales of floppy-disk-based games.

Likewise, a possible explanation for Myst's longevity after it was no longer so cutting-edge might be the specific technological and aesthetic choices made by the Miller brothers. Many other products of the first gush of the CD-ROM revolution came to look painfully, irredeemably tacky just a couple of years after they had dazzled, thanks to their reliance on grainy video clips of terrible actors chewing up green-screened scenery. While Myst did make some use of this type of “full-motion video,” it was much more restrained in this respect than many of its competitors. As a result, it aged much better. By the end of the 1990s, its graphics resolution and color count might have been a bit lower than those of the latest games, and it might not have been quite as stunning at first glance as it once had been, but it remained an elegant, visually appealing experience on the whole.

Yet even these proximate causes don’t come close to providing a full explanation of why this art film in game form sold like a blockbuster. There are plenty of other games of equal or even greater overall merit to which they apply equally well, but none of them sold in excess of 6 million copies. Perhaps all we can do in the end is chalk it up to the inexplicable vagaries of chance. Computer sellers and buyers, it seems, needed a go-to game to show what was possible when CD-ROM was combined with decent graphics and sound cards. Myst was lucky enough to become that game. Although its puzzles were complex, simply taking in its scenery was disarmingly simple, making it perfect for the role. The perfect product at the perfect time, perfectly marketed.

In a sense, Myst the phenomenon didn't do that other Myst — Myst the actual artifact, the game we can still play today — any favors at all. The latter seems destined always to be judged in relation to the former, and destined always to be found lacking. Demanding that what is in reality a well-designed, aesthetically pleasing game live up to the earth-shaking standards implied by Myst's sales numbers is unfair on the face of it; it wasn't the fault of the Miller brothers, humble craftsmen with the right attitude toward their work, that said work wound up selling 6 million copies. Nevertheless, we feel compelled to judge it, at least to some extent, with the knowledge of its commercial and cultural significance firmly in mind. And in this context especially, some of its detractors' claims do have a ring of truth.

Arguably the truthiest of all of them is the oft-repeated old saw that no other game was bought by so many people and yet really, seriously played by so few of its purchasers. While such a hyperbolic claim is impossible to truly verify, there is a considerable amount of circumstantial evidence pointing in exactly that direction. The exceptional sales of the strategy guide are perhaps a wash; they can be as easily ascribed to serious players wanting to really dig into the game as they can to casual purchasers just wanting to see all the pretty pictures on the CD-ROM. Other factors, however, are harder to dismiss. The fact is, Myst is hard by casual-game standards — so hard that Brøderbund included a blank pad of paper in the box for the purpose of keeping notes. If we believe that all or most of its buyers made serious use of that notepad, we have to ask where these millions of people interested in such a cerebral, austere, logical experience were before it materialized, and where they went thereafter. Even the Miller brothers themselves — hardly an unbiased jury — admit that by their best estimates no more than 50 percent of the people who bought Myst ever got beyond the starting island. Personally, I tend to suspect that the number is much lower than that.

Perhaps the most telling evidence for Myst as the game which everyone had but hardly anyone played is found in a comparison with one of its contemporaries: id Software's Doom, the other decade-dominating blockbuster of 1993 (a game about which I'll be writing much more in a future article). Doom indisputably was played, and played extensively. While it wasn't quite the first running-around-and-shooting-things-from-a-first-person-perspective game, it did become so popular that games of its type were codified as a new genre unto themselves. The first-person shooters which followed Doom in the 1990s were among the most popular games of their era. Many of their titles are known to gamers today who weren't yet born when they debuted: titles like Duke Nukem 3D, Quake, Half-Life, Unreal. Myst prompted just as many copycats, but these were markedly less popular and are markedly less remembered today: AMBER: Journeys Beyond, Zork Nemesis, Rama, Obsidian. Only Cyan's own eventual sequel to Myst can be found among the decade's bestsellers, and even it's a definite case of diminishing commercial returns, despite being a rather brilliant game in its own right. In short, any game which sold as well as Myst, and which was seriously played by a proportionate number of people, ought to have left a bigger imprint on ludic culture than this one did.

But none of this should affect your decision about whether to play Myst today, assuming you haven’t yet gotten around to it. Stripped of all its weighty historical context, it’s a fine little adventure game if not an earth-shattering one, intriguing for anyone with the puzzle-solving gene, infuriating for anyone without it. You know what I mean… sort of a niche experience. One that just happened to sell 6 million copies.

(Sources: the books Myst: Prima's Official Strategy Guide by Rick Barba and Rusel DeMaria, Myst & Riven: The World of the D'ni by Mark J.P. Wolf, and The Secret History of Mac Gaming by Richard Moss; Computer Gaming World of December 1993; MacWorld of March 1994; CD-ROM Today of Winter 1993. Online sources include “Two Histories of Myst” by John-Gabriel Adkins, Ars Technica‘s interview with Rand Miller, Robyn Miller's postmortem of Myst at the 2013 Game Developers Conference, GameSpot‘s old piece on Myst as one of the “15 Most Influential Games of All Time,” and Greg Lindsay's Salon column on Myst as a “dead end.” Michael Bywater's colorful comments about Myst come from Peter Verdi's now-defunct Magnetic Scrolls fan site, a dump of which Stefan Meier dug up for me from his hard drive several years ago. Thanks again, Stefan!

The “Masterpiece Edition” of Myst is available for purchase from GOG.com.)

 

Posted on February 21, 2020 in Digital Antiquaria, Interactive Fiction

 
