Bob Stein and Voyager

Voyager tends to be overlooked in almost every survey because we didn’t really fit into anybody’s category. Librarians didn’t really pay much attention. The computer world never cared. Hollywood never really cared. We touched all these industries, but because we weren’t central to any of them and didn’t really ally with any of them in particular, we were in fact always an outlier.

— Bob Stein

In 1945, Vannevar Bush, an advisor to the American government on the subjects of engineering and technology, published his landmark essay “As We May Think,” which proposed using a hypothetical machine called the memex for navigating through an information space using trails of association. In the decades that followed, visionaries like Ted Nelson adapted Bush’s analog memex to digital computers, and researchers at Xerox PARC developed point-and-click interfaces that were ideal for the things that were by now being called “hypertexts.” Finally, in 1990, Tim Berners-Lee, a British computer scientist working at the European Organization for Nuclear Research in Geneva, created the World Wide Web, which moved hypertext onto the globe-spanning computer network known as the Internet. By the mid-1990s, a revolution in the way that all of us access and understand information was underway.

So runs a condensed history of the most life-changing invention in the realm of information transmission and retrieval since Gutenberg’s printing press. But, like any broad overview, it leaves out the many nooks and crannies where some of the interesting stories and artifacts can be found.

Tim Berners-Lee himself has told how the creation of the World Wide Web was more of a process of assembly than invention from whole cloth: “Most of the technologies involved in the Web had been designed already. I just had to put them together.” To wit: a very workable implementation of hypertext, eminently usable by everyday people, debuted on the Apple Macintosh in 1987, two and a half years before the first website went live. Over the course of that period, Apple’s HyperCard was used weekly by millions of people. When combined with the first CD-ROM drives, it gave those lucky people their first heady taste of what computing’s future was to hold for everyone. Even long after the Web had started to hum in earnest, locally hosted experiences, created in HyperCard and similar middleware environments like Macromedia’s Director, were able to combine hypertext with the sort of rich multimedia content that wouldn’t be practical over an Internet connection until after the millennium. This was the brief heyday of the CD as a publishing medium in its own right, a potential rival to the venerable paper-based book.

These CD-based hypertexts differed from the Web in ways more fundamental, even philosophical, than mere audiovisual fidelity. The Web was and is a hyper-social environment, where everyone links to everyone else — where, indeed, links and clicks are the very currency of the realm. This makes it an exciting, dynamic place to explore, but it also has its drawbacks, as our current struggles with online information bubbles and conscious disinformation campaigns illustrate all too well. Hypertextual, multimedia CD-ROMs, on the other hand, could offer closed but curated experiences, where a single strong authorial voice could be preserved. They were perhaps as close as we’ve ever come to non-fiction electronic books: not paper books which have simply been copied onto an electronic device, like ebooks on the Amazon Kindle and its ilk, but books which could not possibly exist on paper — books which supplement their text whenever necessary with sound and video, books consciously designed to be navigated by association. How strange and sad it is to think that they only existed during a relatively brief interstitial period in the history of information technology, when computers could already deliver hypertext and rich multimedia content but before the World Wide Web was widely enough available and fast enough to do everything we might ask of it.

The gold standard for electronic books on CD-ROM was set by the Voyager Company. These productions ooze personality and quality, boasting innovative presentations and a touching faith in the intelligence of their readers. I would go so far as to say that there’s been no other collection of works quite like them in the entire history of electronic media. They stand out for me as some of my most exciting discoveries in all the years I’ve been writing these chronicles of our recent digital past for you. I’m delighted to bring you their story today.



The founder and head of Voyager was one Bob Stein. He was surely one of the most unlikely chief executives in the annals of American business, a man immortalized by Wired magazine as both “the most far-out publishing visionary in the new world” and “the least effective businessman alive.”

Stein was born in New York City in 1946, the son of a wealthy jewelry importer. His upbringing was, as he described it to me, “extremely privileged,” with all the best schools and opportunities that his parents’ money could buy. In high school, he imagined becoming an accountant or a lawyer. But he wound up going to Columbia University as a psychology undergraduate instead, and there he was swept up in the radical politics of the hippie generation. He found a home in the Revolutionary Communist Party, a group which hewed to the China of Chairman Mao in the internecine split that marked the international communist movement. Stein:

I was a revolutionary. I am not a revolutionary anymore, but, although my ideology has shifted, it hasn’t changed. I think we’re still many, many years away from making a judgment about the [Chinese] Cultural Revolution. Anything that is that broad, that encompasses a billion people over a ten-year period, is going to have so many facets to it. I will go to my grave saying that, from one perspective, the Cultural Revolution was the high point of humanity at the point when it happened. We’d never seen a society that was so good for so many of its people. That doesn’t mean it was good for everybody; intellectuals in particular suffered if they were not onboard with what was happening. And intellectuals are the ones who tell the story. So, a lot of the stories are told by intellectuals who didn’t do well during the Cultural Revolution. It was a hard time in China for a lot of people — but I don’t fault the Chinese for trying. Whether they failed is not as interesting to me as whether they tried.

Stein spent the late 1960s and most of the 1970s as a committed communist revolutionary, even as he was also earning a graduate degree in education from Harvard and working as a teacher and grass-roots community activist. Over the years, however, he grew more and more frustrated as the worldwide communist revolution he had been promised failed to materialize.

By the time I was in my early thirties, it became clear that revolution was much, much further away than I had thought when I signed up, as it were. So I made the extremely selfish decision to go do something else with my life. I did that because I could. With degrees from Columbia and Harvard, the world was my oyster. If a white guy like me wanted to come in from the cold, nobody looked askance. I remember walking down the street in New York with my daughter just after I left the Party, heading to some meeting I had set up with the president of CBS, whom I didn’t know, but I knew how to write a letter. She turned to me and said, “You know, you couldn’t be doing this transition if you didn’t have the background you have.”  And that was right. It wasn’t that I had any illusion that I was suddenly doing from the inside what I couldn’t do from the outside, it was that I was going to do something interesting and of value to humanity. I wasn’t making revolution anymore, but I would do things that had social value.

The question, of course, was just what those things should be. To the extent that he was aware of it at all, Stein had been unimpressed by the technological utopianism of organizations like Silicon Valley’s People’s Computer Company and Homebrew Computer Club. As a thoroughgoing Maoist, he had believed that society needed to be remade from top to bottom, and found such thinking as theirs naïve: “I didn’t think you could liberate humanity by doing cool shit on computers.”

Stein’s eureka moment came in the bathroom of the Revolutionary Communist Party’s Propaganda Headquarters in Chicago. (Yes, a place by that name really existed.) Someone had left a copy of BusinessWeek there, presumably to help the party faithful keep tabs on the enemy. In it was an article about MCA’s work on what would become known as the laser disc, the first form of optical media to hit the consumer market; each of the record-album-sized discs was capable of storing up to one hour of video and its accompanying audio on each of its sides. Unlike the later audio-only compact disc, the laser disc was an analog rather than digital storage medium, meaning it couldn’t practically be used to store forms of data other than still and moving images and audio. Nevertheless, it could be controlled by an attached computer and made to play back arbitrary snippets of same. “It just seemed cool to me,” says Stein.

Shortly afterward, he and his wife Aleen Stein left the Party and moved to Los Angeles, where he spent his afternoons in the library, trying to make a plan for his future, and his nights working as a waiter. The potential of random-access optical media continued to intrigue him, leading him in the end into a project for Encyclopedia Britannica.

I failed Physics for Poets; I’ve never been technically oriented. I read an article where the chief scientist for Encyclopedia Britannica talked about putting the entire encyclopedia on a disc. I didn’t realize he meant that you could just do a digital dump; I thought he meant you could put it on there in a way that was interesting. So I wrote a letter to Random House asking if I could buy the rights to the Random House encyclopedia, which I quite liked; it was much less stodgy than the Encyclopedia Britannica.

A friend of mine asked me what I was into these days, and I sent her a copy of the letter, forgetting that her father was on the board of Encyclopedia Britannica. A few weeks later, I got a call from Chuck Swanson, the president of Encyclopedia Britannica. “If you know so much,” he said, “why don’t you come and talk to us?” I didn’t know anything!

I found this guy at the University of Nebraska who had made a bunch of videodiscs for the CIA, and I convinced him to come with me to Chicago to provide the gravitas for the meeting. We did a demo for Chuck Swanson and Charlie Van Doren, who was the head of editorial; he was like a kid in a candy store, he fell in love with this shit. They hired me to go away for a year and write a paper.

Stein was fortunate in having made his pitch at just the right time. Traditional print publishers like Encyclopedia Britannica were just beginning to reckon with the impact that the nascent personal-computer revolution was likely to have on their businesses; at almost the same instant that Stein was having his meeting, no less staid a publishing entity than the Reader’s Digest Corporation was investing $3 million in The Source, one of the first nationwide telecommunications services for everyday computer users. In time, when such things proved to catch on more slowly than their most wide-eyed boosters had predicted, print publishing’s ardor for electronic media would cool again, or be channeled into such other outlets as the equally short-lived bookware boom in computer games.

Stein’s final paper, which he sent to Encyclopedia Britannica in November of 1981, bore the dizzyingly idealistic title of “Encyclopedia Britannica and the Intellectual Tools of the Future.” He still considers it the blueprint of everything he would do in later years. Indeed, the products which he imagined Encyclopedia Britannica publishing would have fit in very well with the later Voyager CD-ROM catalog: Great Moments in History: A Motion Picture Archive; Please Explain with Isaac Asimov; Space Exploration: Past, Present, and Future; Invention and Discovery: The History of Technology; Everyday Life in the American Past; Origins: Milestones in Archaeological Discovery; Computers: The Mystery Unraveled; A Cultural History of the United States; Britannica Goes to London; The Grand Canyon: A Work in Progress. He pleaded with the company to think in the long term. From the report:

The existing [laser-disc] market is so small that immediate returns can only be minimal. To put this another way, if only the next couple of years are considered, more money can be made by investing in bonds than in videodiscs. On the other hand, the application of a carefully designed program now can minimize risks and put Encyclopedia Britannica in a position to realize impressive profits in a few years when the market matures. Three years from now, when other publishers are scrambling to develop video programs, Encyclopedia Britannica will already have a reputation for excellence and enjoy wide sales as a result. Furthermore, the individual programs we have described are designed to be classics that would be sold for many years.

As we’ll see, this ethic of getting in on the ground floor right now, even though profits might be nonexistent for the nonce, would also become a core part of Voyager’s corporate DNA; it would be a company perpetually waiting for a fondly predicted future when the financial floodgates would open. Encyclopedia Britannica, however, wasn’t interested in chasing unicorns at this time. Stein was paid for his efforts and his report was filed away in a drawer somewhere, never to be heard of again.

But Stein himself no longer had any doubts about what sorts of socially valuable things he wanted to work on. His journey next took him to, of all places, the videogame giant Atari.

At the time, the first wave of videogame mania was in full swing in the United States, with Atari at its epicenter, thanks to their Atari VCS home console and their many hit standup-arcade games. They quite literally had more money than they knew what to do with, and were splashing it around in some surprising ways. One of these was the formation of a blue-sky research group that ranged well beyond games to concern itself with the future of computing and information technology in the abstract. Its star — in fact, the man with the title of Atari’s “Chief Scientist” — was Alan Kay, a famous name already among digital futurists: while at Xerox PARC during the previous decade, he had developed Smalltalk, an object- and network-oriented programming language whose syntax was simple enough for children to use, and had envisioned something he called the Dynabook, a handheld computing device that today smacks distinctly of the Apple iPad. Reading about the Dynabook in particular, Stein decided that Alan Kay was just the person he needed to talk to. Stein:

I screwed up my courage one day and contacted him, and he invited me to come meet him. Alan read the [Encyclopedia Britannica] paper — all 120 pages of it — while we were sitting together. He said, “This is great. This is just what I want to do. Come work with me.”

Stein was joining a rarefied collection of thinkers. About half of Kay’s group consisted of refugees from Xerox PARC, while the other half was drawn from the brightest lights at the Massachusetts Institute of Technology’s Architecture Machine Group, another hotbed of practical and theoretical innovation in computing.

Atari’s corporate offices were located in Silicon Valley, while both Stein and Kay lived in Los Angeles. In a telling testimony to the sheer amount of money at Atari’s disposal, they allowed the two to commute to work every day by airplane. Kay would sit next to Stein on the plane and “talk at me about two things I didn’t really know much about: music and computers. But I knew how to nod, so he thought I was understanding, and kept talking.”

Amidst this new breed of revolutionary dreamers, Stein found himself embracing the unwonted role of the practical man; he realized that he wanted to launch actual products rather than being content to lay the groundwork for the products of others, as most of his colleagues were. He took to flying regularly to New York, where he attempted to interest the executives at Warner Communications, Atari’s parent company, in the ideas being batted around on the other coast. But, while he secured many meetings, nothing tangible came out of them. After some eighteen months, he made the hard decision to move on. Once again, his timing was perfect: he left Atari just as the Great Videogame Crash was about to wreck their business. Less than a year later, Warner would sell the remnants of the company off in two chunks for pennies on the dollar, and Alan Kay’s research group would be no more.

Working independently now, Stein continued to try to interest someone — anyone — in pursuing the directions he had roughed out in his paper for Encyclopedia Britannica and refined at Atari. He got nowhere — until a spontaneous outburst set him on a new course: “At a boring meeting with RKO Home Video, I said off the top of my head, ‘How about selling me a very narrow right to Citizen Kane and King Kong?’ They said, ‘Sure.’” For the princely sum of $10,000, Stein walked away with the laser-disc rights to those two classic films.

Now all those hours spent hobnobbing with Warner executives paid off. Being idea-rich but cash-poor, Stein convinced a former Warner senior vice president named Roger Smith, who had been laid off in the wake of the Atari implosion, to put up his severance package as the capital for the venture. Thus was formed the Criterion Collection as a partnership between Bob and Aleen Stein and Roger Smith. The name is far better known today than that of Voyager — for, unlike Voyager, Criterion is still going as strong as ever.

The Criterion Citizen Kane and King Kong hit the market in November of 1984, causing quite the stir. For all that our main interest in this article is the later CD-ROMs of Voyager, it’s impossible to deny that, of all the projects Stein has been involved with, these early Criterion releases had the most direct and undeniable influence on the future of media consumption. Among many other innovations, Criterion took advantage of a heretofore little-remarked feature of the laser disc, the ability to give the viewer a choice of audio tracks to go with the video; this let them deploy the first-ever commentary track on the King Kong release, courtesy of film historian and preservationist Ron Haver. Stein describes how it came about:

At that time, and I don’t think it’s all that different today, transferring an old film to video meant sitting in a dark room (which cost hundreds of dollars per hour) making decisions about the correct color values for each individual shot. That was Ron’s job, and somewhat coincidentally, King Kong was his favorite film, and he kept us entertained by telling countless stories about the history of the film’s production. Someone said, “Hey, we’ve got extra sound tracks… why don’t we have Ron tell these stories while the film is playing?” Ron’s immediate reaction was “Are you kidding, NO WAY!”  The idea seemed too perfect to pass up, though, so I asked Ron if being stoned might help. He thought for a moment and said, “Hmmm! That might work.”  And so the next day we recorded Ron telling stories while the film played.


Criterion also included deleted scenes, making-of documentaries, and other materials in addition to the commentary. In doing so, they laid out what would become standard operating procedure for everyone selling movies for the home once the DVD era arrived twelve years later.

Still, Stein himself believes that the real bedrock of Criterion’s reputation was the quality of the video transfer itself. The Criterion name became a byword for quality, the brand of choice for serious aficionados of film — a status it would never relinquish amidst all the changes in media-consumption habits during the ensuing decades.

So, when Bob Stein dies, his obituary is almost certain to describe him first and foremost as a co-founder of the Criterion Collection, the propagator of a modest revolution in the way we all view, study, and interact with film. Yet Criterion was never the entirety of what Stein dreamed of doing, nor perhaps quite what he might have wished his principal legacy to be. Although he had an appreciation for film, he didn’t burn for it with the same passion evinced by Criterion’s most loyal customers. That description better fit his wife Aleen, who would come to do most of the practical, everyday work of managing Criterion. “I’m a book guy,” he says. “I love books.” The Criterion Collection made quality products of which he was justifiably proud, but it would also become a means to another end, the funding engine that let him pursue his elusive dream of true electronic books.

The company known as Voyager was actually formed as early as 1985, the result of a series of negotiations, transactions, and fallings-out which followed the release of the first two Criterion laser discs. The shakeup began when Roger Smith left the venture, having found its free-wheeling hippie culture not to his taste, and announced his determination to take the Criterion name with him. This prompted the Steins to seek out a new partner in Janus Films, a hallowed name among cineastes, a decades-old art-film distributor responsible for importing countless international classics to American shores. Together they formed the Voyager Company — named after the pair of space probes — to continue what Criterion had begun. Soon, however, it emerged that Smith was willing to sell them back the Criterion name for a not-unreasonable price. So, the logic of branding dictated that the already established Criterion Collection continue as an imprint of the larger Voyager Company.

Stein’s dreams for electronic books remained on hold for a couple of years, while the Criterion Collection continued to build a fine reputation for itself. Then, in August of 1987, Apple premiered a new piece of software called HyperCard, a multimedia hypertext authoring tool that was to be given away free with every new Macintosh computer. Within weeks of HyperCard’s release, enterprising programmers had developed ways of using it to control an attached laser-disc player. This was the moment, says Stein, that truly changed everything.

I should note at this juncture that the idea of doing neat things with an ordinary personal computer and an attached laser-disc player was not really a new one in the broad strokes. As far back as 1980, the American Heart Association deployed a CPR-training course built around an Apple II and a laser-disc player. Other such setups were used for pilot training, for early-childhood education, and for high-school economics courses. In January of 1982, Creative Computing magazine published a type-in listing for what was billed as the world’s first laser-disc game, which required a degree of hardware-hacking aptitude and a copy of the laser-disc release of the 1977 movie Rollercoaster to get working. Eighteen months later, laser-disc technology reached arcades in the form of Dragon’s Lair, which was built around 22 minutes of original cartoon footage from the former Disney animator Don Bluth. When the Commodore Amiga personal computer shipped in 1985, it included the ability to overlay its graphics onto other video sources, a feature originally designed with interactive laser-disc applications in mind.

Nevertheless, the addition of HyperCard to the equation did make a major difference for people like Bob Stein. On the one hand, it made controlling the laser-disc player easier than ever before — easy enough even for an avowed non-programmer like him. And on the other hand, hypertexts rather than conventional computer programs were exactly what he had been wanting to create all these years.

Voyager’s first ever product for computers began with an extant laser disc published by the National Gallery of Art in Washington, D.C., which included still images of the entirety of the museum’s collection along with scattered video snippets. From this raw material, Stein and two colleagues created the National Gallery of Art Laserguide, a tool for navigating and exploring the collection in a multitude of ways. It caused a big stir at the Macworld Expo in January of 1988: “We became stars! It was fucking magic!” Shortly thereafter, Stein demonstrated it on the television show The Computer Chronicles.


Exciting as experiments like this one were, they couldn’t overcome the inherent drawbacks of the laser disc. As I noted earlier, it was an analog medium, capable of storing only still or moving images and audio. And, as can be seen in the segment above, using it in tandem with a computer meant dealing with two screens — one connected to the computer, the other to the laser-disc player itself. The obvious alternative was the compact disc, a format which, although originally developed for music, was digital rather than analog, and thus could be adapted to store any kind of data and to display it right on a computer’s monitor.

After a tortuously protracted evolution, CD-ROM was slowly sputtering to life as at least a potential proposition for the commercial marketplace. Microsoft was a big booster, having sponsored a CD-ROM conference every year since 1985. The first CD-based software products could already be purchased, although they were mostly uninspiring data dumps targeted at big business and academia rather than the polished, consumer-oriented works Stein aspired to make.

Moving from laser discs to CDs was not an unmitigated positive. The video that an attached laser-disc player could unspool so effortlessly had to be encoded digitally in order to go onto a CD, then decoded and pushed to a monitor in real time by the computer attached to the CD-ROM drive. It was important to keep the file size down; the 650 MB that could be packed onto a CD sounded impressive enough, but weren’t really that much at all when one started using them for raw video. At the same time, the compression techniques that one could practically employ were sharply limited by the computing horsepower available to decompress the video on the fly. The only way to wriggle between this rock and a hard place was to compromise — to compromise dramatically — on the fidelity of the video itself. Grainy, sometimes jerky video, often displayed in a window little larger than a postage stamp, would be the norm with CD-ROM for some time to come. This was more than a little ironic in the context of Voyager, a company whose other arm was so famed for the quality of its movie transfers to laser disc. Suffice to say that no Voyager CD-ROM would ever look quite as good as that National Gallery of Art Laserguide, much less the Criterion King Kong.
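Some rough, back-of-the-envelope arithmetic makes the squeeze clear. The frame size and frame rate below are purely illustrative assumptions rather than figures from any particular Voyager title, but they show why uncompressed video, a 650 MB disc, and the roughly 150-KB-per-second transfer rate of an early CD-ROM drive added up to such an awkward combination:

```c
/* Back-of-the-envelope numbers for raw (uncompressed) video on a CD-ROM.
   The frame size and frame rate are illustrative assumptions only. */

#include <stdio.h>

int main(void)
{
    const double width = 320, height = 240;           /* a modest playback window */
    const double bytes_per_pixel = 3;                 /* 24-bit color */
    const double frames_per_second = 15;
    const double cd_capacity = 650.0 * 1024 * 1024;   /* ~650 MB per disc */
    const double drive_rate = 150.0 * 1024;           /* ~150 KB/s from a 1x drive */

    double bytes_per_second = width * height * bytes_per_pixel * frames_per_second;

    printf("Raw video data rate: %.1f MB/second\n", bytes_per_second / (1024 * 1024));
    printf("Minutes of raw video on one disc: %.1f\n", cd_capacity / bytes_per_second / 60);
    printf("Compression needed just to keep up with a 1x drive: %.1fx\n",
           bytes_per_second / drive_rate);
    return 0;
}
```

Even with these generous assumptions, a single disc holds only a few minutes of raw video, and an early drive couldn't read it back anywhere near fast enough in any case; hence the heavy compression and the postage-stamp playback windows.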

Still, it was clear that the future of Voyager lay with CDs rather than laser discs. Looking for a proof of concept for an electronic book, Stein started to imagine a CD-ROM that would examine a piece of music in detail. He chose that theme, he likes to say only half jokingly, so that he could finally figure out what it was Alan Kay had been going on about in the seat next to his during all those airplane rides of theirs. But there was also another, more practical consideration: it was possible to create what were known as “mixed-mode” CDs, in which music and sound were stored in standard audio-CD format alongside other forms of data. These music tracks could be played back at full home-stereo fidelity by the CD-ROM drive itself, with minimal intervention from the attached computer. This usage scenario was, in other words, analogous to that of controlling an attached laser disc, albeit in this case it was only sound that could be played back so effortlessly at such glorious fidelity.

Thus it came to pass that on January 1, 1989, one Robert Winter, a 42-year-old music professor at UCLA, received a visitor at his Santa Monica bungalow: it was Bob Stein, toting an Apple CD-ROM drive at his side. “I thought it was the strangest-looking thing,” Winter says. He had known Stein for some seven years, ever since the latter had sat in on one of his classes, and had often complained to him about the difficulty of writing about music using only printed words on the page; it was akin to “dancing about architecture,” as a famous bit of folk wisdom put it. These complaints had caused Stein to tag him as “a multimedia kind of guy”; “I had no idea what he meant,” admits Winter. But now, Stein showed Winter how an Apple Macintosh equipped with HyperCard and a CD-ROM drive could provide a new, vastly better way of writing about music — by adding to the text the music itself for the reader to listen along with, plus graphics wherever they seemed necessary, with the whole topped off by that special sauce of hypertextual, associative interactivity. The professor took the bait: “I knew then and there that this was my medium.”

Robert Winter became such a star that he was hired by a company called Chinon to pitch their CD-ROM drives.

The two agreed to create a meticulous deconstruction of a towering masterpiece of classical music, Ludwig van Beethoven’s Ninth Symphony. Winter:

I simply asked myself, “What is it I would like to know?” It occurred to me that I would like to know what is happening as the piece is playing. I start [my users] at the bubble-bath level, then the score, then the commentary, and a more detailed commentary.

The heart of the program was the “Close Reading,” where one could read a description of each passage, then listen to it from the same screen — or flip to the score to see what it looked like in musical notation, or flip to a more detailed, technical explanation, always with the chance to listen just a button-click away. Lest the whole experience become too dry, Winter sprinkled it with the wit that had made him a favorite lecturer at his university and a frequent guest commentator on National Public Radio. He equated stability in music with “an apartment you can make the rent on”; conflict in music was “sparring with a boss who doesn’t know how to give strokes.” When it came time for that section of the symphony, the one which absolutely everyone can hum — i.e., the famous “Ode to Joy” — “We’ve arrived!” flashed in big letters on the screen. The text of the poem by Friedrich Schiller which Beethoven set to music for the choir to sing was naturally also included, in the original German and in an English translation. Winter even added a trivia game; answer a question correctly, and Beethoven would wink at you and say, “Sehr gut!” aloud.


“I don’t think anything is any good if it doesn’t have a point of view,” said Bob Stein on one occasion. In being so richly imbued with its maker Robert Winter’s personality and point of view, the CD Companion to Beethoven’s Ninth Symphony was a model for all of the Voyager CDs to come. “You don’t set his CD-ROMs aside when you’ve exhausted the gimmicks,” wrote Wired magazine of Winter’s interactive works years later. “You keep coming back to them. There always seems to be more intellectual matter – more substance – to uncover.” This too would be the case for Voyager’s CDs in general. They would really, earnestly engage with their subject matter, rather than being content to coast on the novelty of their medium like most of their peers. “It is one of the few producers to offer actual ideas on CD-ROM,” wrote Wired of Voyager itself.

Stein and Winter brought their finished product to the Macworld Expo of August of 1989, where it caused just as much of a sensation as had the National Gallery of Art laser disc nineteen months before. “People stood there for thirty minutes as if they were deer in front of headlights,” says Winter. He claims that many at the show bought CD-ROM drives just in order to run the CD Companion to Beethoven. If it was not quite the first consumer-oriented CD-ROM, it was among the most prominent during the format’s infancy. “We’ve finally seen what CD-ROM was made for!” said Bill Gates upon his first viewing of the program. Like all of its ilk, its initial sales were limited by the expensive hardware needed to run it, but it was written about again and again as an aspirational sign of the times that were soon to arrive. “It takes us up to Beethoven’s worktable and lays bare the whole creative process,” enthused the Los Angeles Herald. Voyager was off and running at last as a maker of electronic books.

Their products would never entirely escape from the aspirational ghetto for a variety of reasons, beginning with their esoteric, unabashedly intellectual subject matter, continuing with the availability of most of them only on the Macintosh (a computer with less than 10 percent of the overall market share), and concluding with the World Wide Web waiting there in the wings with a whole different interpretation of hypertext’s affordances. The CD Companion to Beethoven, which eventually sold 130,000 copies on the back of all the hype and a version for Microsoft Windows,[1] would remain the company’s most successful single product ever; the majority of the Voyager CD-ROMs would never break five digits, much less six, in total unit sales. Yet Stein would manage to keep the operation going for seven years by hook or by crook: by “reinvesting” the money turned over by the Criterion Collection, by securing grants and loans from the Markle Foundation and Apple, and by employing idealistic young people who were willing to work cheap; “If you’re over 30 at Voyager,” said one employee, “you feel like a camp counselor.” Thanks not least to this last factor, the average budget for a Voyager CD-ROM was only about $150,000.

During the first couple of years, Robert Winter’s CD-ROMs remained the bedrock of Voyager; in time, he created explorations of Dvořák, Mozart, Schubert, Richard Strauss, and Stravinsky in addition to Beethoven. By 1991, however, Voyager was entering its mature phase, pushing in several different directions. By 1994, it was publishing more than one new CD per month, on a bewildering variety of subjects.

In the next article, then, we’ll begin to look at this rather extraordinary catalog in closer detail. If any catalog of creative software is worth rediscovering, it’s this one.

(Sources: the book The DVD and the Study of Film: The Attainable Text by Mark Parker and Deborah Parker; Wired of December 1994 and July 1996; CD-ROM Today of June/July 1994; Macworld of November 1988; New York Times of November 8 1992; the 1988 episode of the Computer Chronicles television show entitled “HyperCard”; Phil Salvador’s online interview with Bob Stein. The majority of this article is drawn from a lengthy personal interview with Bob Stein and from his extensive online archives. Thank you for both, Bob!)

Footnotes

1 This version bore the title of Multimedia Beethoven.


The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while expanding its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would offer 16-bit resolution — i.e., the same quality as an audio CD.

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs —  but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly behind schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed to the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; what games were still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D). But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly in vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beat the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh of Commodore’s.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals likewise asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992, Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.
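To make that distinction concrete, here is a minimal sketch of what “hitting the bare metal” looked like in practice. The register address is the documented location of the original chipset’s first color register; everything else about the snippet (the names, the loop, the decision to bypass the operating system entirely) is purely illustrative of the programming style described above, not code from any actual game.

```c
/* A sketch of the "bootstrap only" style: AmigaOS loads the program, then gets
   shoved aside while the custom chips are programmed directly. Only meaningful
   on real Amiga hardware, naturally. */

#include <stdint.h>

/* COLOR00, the first color register of the original chipset, sits at a fixed
   address in the custom-chip register space. Games simply poked values into it. */
#define COLOR00 (*(volatile uint16_t *)0xDFF180)

int main(void)
{
    /* Cycle the screen's background color by writing the register directly.
       Fast and simple, but invisible to the OS and to every other program,
       and brittle the moment the underlying hardware changes, as it did when
       AGA extended the palette. The "proper" route was to ask the OS (via
       graphics.library) to change the color on the program's behalf. */
    for (uint16_t rgb = 0; ; rgb = (rgb + 1) & 0x0FFF)
        COLOR00 = rgb;   /* 12-bit RGB, four bits per channel, on the original chips */
}
```

The appeal and the danger are the same thing: nothing stands between the program and the hardware.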

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which were stamped with characters only in some continental European markets that needed them. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped fewer than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their resumes, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like this one, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.
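The article doesn’t name the technique, but pointer-drawing patents of this era typically covered exclusive-OR (“XOR”) drawing, so treat the following as an assumption about what was at issue. The appeal of the method is that XORing the pointer onto the framebuffer means the very same operation both draws and erases it, with no need to save the pixels underneath; the buffer layout and sizes below are illustrative only.

```c
/* A minimal sketch of XOR pointer drawing; the screen layout and sizes here
   are illustrative assumptions, not tied to any particular machine. */

#include <stdint.h>

#define SCREEN_W 320
#define SCREEN_H 200
#define CURSOR     8

static uint8_t framebuffer[SCREEN_W * SCREEN_H];  /* hypothetical 8-bit-per-pixel screen */

/* Because pixel ^ mask ^ mask == pixel, calling this once shows the pointer and
   calling it again at the same spot restores the original pixels exactly. */
static void xor_cursor(int x, int y, const uint8_t mask[CURSOR][CURSOR])
{
    for (int row = 0; row < CURSOR && y + row < SCREEN_H; row++)
        for (int col = 0; col < CURSOR && x + col < SCREEN_W; col++)
            framebuffer[(y + row) * SCREEN_W + (x + col)] ^= mask[row][col];
}
```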

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. Ironically, this proved to be a more profitable endeavor for them than actually selling computers or game consoles had ever been. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks, who saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its last issue until the rather astonishingly late date of November of 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s ridiculously exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses in the one arena by raising his personal salary in the other.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and stopped throwing good money after bad as soon as his loans exceeded the collateral the company could offer up in exchange. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died on June 20, 1994, at age 62, of the kidney disease he had been battling for most of his life. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes it Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)

 


The 68000 Wars, Part 5: The Age of Multimedia

A group of engineers from Commodore dropped in unannounced on the monthly meeting of the San Diego Amiga Users Group in April of 1988. They said they were on their way to West Germany with some important new technology to share with their European colleagues. With a few hours to spare before they had to catch their flight, they’d decided to share it with the user group’s members as well.

They had with them nothing less than the machine that would soon be released as the next-generation Amiga: the Amiga 3000. From the moment they powered it up to display the familiar Workbench startup icon re-imagined as a three-dimensional ray-traced rendering, the crowd was in awe. The new model sported a 68020 processor running at more than twice the clock speed of the old 68000, with a set of custom chips redesigned to match its throughput; graphics in 2 million colors instead of 4096, shown at non-interlaced — read, non-flickering — resolutions of 640 X 400 and beyond; an AmigaOS 2.0 Workbench that looked far more professional than the garish version 1.3 that was shipping with current Amigas. The crowd was just getting warmed up when the team said they had to run. They did, after all, have a plane to catch.

Word spread like crazy over the online services. Calls poured in to Commodore’s headquarters in West Chester, Pennsylvania, but they didn’t seem to know what any of the callers were talking about. Clearly this must be a very top-secret project; the engineering team must have committed a major breach of protocol by jumping the gun as they had. Who would have dreamed that Commodore was already in the final stages of a project which the Amiga community had been begging them just to get started on?

Who indeed? The whole thing was a lie. The tip-off was right there in the April date of the San Diego Users Group Meeting. The president of the group, along with a few co-conspirators, had taken a Macintosh II motherboard and shoehorned it into an Amiga 2000 case. They’d had “Amiga 3000” labels typeset and stuck them on the case, and created some reasonable-looking renderings of Amiga applications, just enough to get them through the brief amount of time their team of “Commodore engineers” — actually people from the nearby Los Angeles Amiga Users Group — would spend presenting the package. When the truth came out, some in the Amiga community congratulated the culprits for a prank well-played, while others were predictably outraged. What hurt more than the fact that they had been fooled was the reality that a Macintosh that was available right now had been able to impersonate an Amiga that existed only in their dreams. If that wasn’t an ominous sign for their favored platform’s future, it was hard to say what would be.

Of course, this combination of counterfeit hardware and sketchy demos, no matter how masterfully acted before the audience, couldn’t have been all that convincing to a neutral observer with a modicum of skepticism. Like all great hoaxes, this one succeeded because it built upon what its audience already desperately wanted to believe. In doing so, it inadvertently provided a preview of what it would mean to be an Amiga user in the future: an ongoing triumph of hope over hard-won experience. It’s been said before that the worst thing you can do is to enter into a relationship in the hope that you will be able to change the other party. Amiga users would have reason to learn that lesson over and over again: Commodore would never change. Yet many would never take the lesson to heart. To be an Amiga user would be always to be fixated upon the next shiny object out there on the horizon, always to be sure this would be the thing that would finally turn everything around, only to be disappointed again and again.

Hoaxes aside, rumors about the Amiga 3000 had been swirling around since the introduction of the 500 and 2000 models in 1987. But for a long time a rumor was all the new machine was, even as the MS-DOS and Macintosh platforms continued to evolve apace. Commodore’s engineering team was dedicated and occasionally brilliant, but their numbers were tiny in comparison to those of comparable companies, much less bigger ones like Apple and IBM, the latter of which had an annual research budget greater than Commodore’s total sales. And Commodore’s engineers were perpetually underpaid and underappreciated by their managers to boot. The only real reason for a top-flight engineer to work at Commodore was love of the Amiga itself. In light of the conditions under which they were forced to work, what the engineering staff did manage to accomplish is remarkable.

After the crushing disappointment that had been the 1989 Christmas season, when Commodore’s last and most concerted attempt to break the Amiga 500 into the American mainstream had failed, it didn’t take hope long to flower again in the new year. “The chance for an explosive Amiga market growth is still there,” wrote Amazing Computing at that time, in a line that could have summed up the sentiment of every issue they published between 1986 and 1994.

Even so, reasons for optimism did seem to exist. For one thing, Commodore’s American operation had another new man in charge, an event which always brought with it the hope that the new boss might not prove the same as the old boss. Replacing the unfortunately named Max Toy was Harold Copperman, a real, honest-to-goodness computer-industry veteran, coming off a twenty-year stint with IBM, followed by two years with Apple; he had almost literally stepped offstage from the New York Mac Business Expo, where he had introduced John Sculley to the speaker’s podium, and into his new office at Commodore. With the attempt to pitch the Amiga 500 to low-end users as the successor to the Commodore 64 having failed to gain any traction, the biggest current grounds for optimism was that Copperman, whose experience was in business computers, could make inroads into that market for the higher-end Amiga models. Rumor had it that the dismissal of Toy and the hiring of Copperman had occurred following a civil war that had riven the company, with one faction — Toy apparently among them — saying Commodore should de-emphasize the Amiga in favor of jumping on the MS-DOS bandwagon, while the other faction saw little future — or, perhaps better said, little profit margin — in becoming just another maker of commodity clones. If you were an Amiga fan, you could at least breathe a sigh of relief that the right side had won out in that fight.

The Amiga 3000

It was in that hopeful spring of 1990 that the real Amiga 3000, a machine custom-made for the high-end market, made its bow. It wasn’t a revolutionary update to the Amiga 2000 by any means, but it did offer some welcome enhancements. In fact, it bore some marked similarities to the hoax Amiga 3000 of 1988. For instance, replacing the old 68000 was a 32-bit 68030 processor, and replacing AmigaOS 1.3 was the new and much-improved — both practically and aesthetically — AmigaOS 2.0. The flicker of the interlaced graphics modes could finally be a thing of the past, at least if the user sprang for the right type of monitor, and a new “super-high resolution” mode of 1280 X 400 was available, albeit with only four onscreen colors. The maximum amount of “chip memory” — memory that could be addressed by the machine’s custom chips, and thus could be fully utilized for graphics and sound — had already increased from 512 K to 1 MB with the release of a “Fatter Agnus” chip, which could be retrofitted into older examples of the Amiga 500 and 2000, in 1989. Now it increased to 2 MB with the Amiga 3000.

The rather garish and toy-like AmigaOS 1.3 Workbench.

The much slicker Workbench 2.0.

So, yes, the Amiga 3000 was very welcome, as was any sign of technological progress. Yet it was also hard not to feel a little disappointed that, five years after the unveiling of the first Amiga, the platform had only advanced this far. The hard fact was that Commodore’s engineers, forced to work on a shoestring as they were, were still tinkering at the edges of the architecture that Jay Miner and his team had devised all those years before rather than truly digging into it to make the more fundamental changes that were urgently needed to keep up with the competition. The interlace flicker was eliminated, for instance, not by altering the custom chips themselves but by hanging an external “flicker fixer” onto the end of the bus to de-interlace the interlaced output they still produced before it reached the monitor. And the custom chips still ran no faster than they had in the original Amiga, meaning the hot new 68030 had to slow down to a crawl every time it needed to access the chip memory it shared with them. The color palette remained stuck at 4096 shades, and, with the exception of the new super-high resolution mode, whose weirdly stretched pixels and four colors limited its usability, the graphics modes as a whole remained unchanged. Amiga owners had spent years mocking the Apple Macintosh and the Atari ST for their allegedly unimaginative, compromised designs, contrasting them continually with Jay Miner’s elegant dream machine. Now, that argument was getting harder to make; the Amiga too was starting to look a little compromised and inelegant.

Harold Copperman personally introduced the Amiga 3000 in a lavish event — lavish at least by Commodore’s standards — held at New York City’s trendy Palladium nightclub. With CD-ROM in the offing and audiovisual standards improving rapidly across the computer industry, “multimedia” stood with the likes of “hypertext” as one of the great buzzwords of the age. Commodore was all over it, even going so far as to name the event “Multimedia Live!” From Copperman’s address:

It’s our turn. It’s our time. We had the technology four and a half years ago. In fact, we had the product ready for multimedia before multimedia was ready for a product. Today we’re improving the technology, and we’re in the catbird seat. It is our time. It is Commodore’s time.

I’m at Commodore just as multimedia becomes the most important item in the marketplace. Once again I’m with the leader. Of course, in this industry a leader doesn’t have any followers; he just has a lot of other companies trying to pass him by. But take a close look: the other companies are talking multimedia, but they’re not doing it. They’re a long way behind Commodore — not even close.

Multimedia is a first-class way for conveying a message because it takes the strength of the intellectual content and adds the verve — the emotion-grabbing, head-turning, pulse-raising impact that comes from great visuals plus a dynamic soundtrack. For everyone with a message to deliver, it unleashes extraordinary ability. For the businessman, educator, or government manager, it turns any ordinary meeting into an experience.

In a way, this speech was cut from the same cloth as the Amiga 3000 itself. It was certainly a sign of progress, but was it progress enough? Even as he sounded more engaged and more engaging than had plenty of other tepid Commodore executives, Copperman inadvertently pointed out much of what was still wrong with the organization he helmed. He was right that Commodore had had the technology to do multimedia for a long time; as I’ve argued at length elsewhere, the Amiga was in fact the world’s first multimedia personal computer, all the way back in 1985. Still, the obvious question one is left with after reading the first paragraph of the extract above is why, if Commodore had the technology to do multimedia four and a half years ago, they’ve waited until now to tell anyone about it. In short, why is the world of 1990 “ready” for multimedia when the world of 1985 wasn’t? Contrary to Copperman’s claim about being a leader, Commodore’s own management had begun to evince an understanding of what the Amiga was and what made it special only after other companies had started building computers similar to it. Real business leaders don’t wait around for the world to decide it’s ready for their products; they make products the world doesn’t yet know it needs, then tell it why it needs them. Five years after being gifted with the Amiga, which stands alongside the Macintosh as one of the two most visionary computers of the 1980s precisely because of its embrace of multimedia, Commodore managed at this event to give every impression that they were the multimedia bandwagon jumpers.

The Amiga 3000 didn’t turn into the game changer the faithful were always dreaming of. It sold moderately, mostly to the established Amiga hardcore, but had little obvious effect on the platform’s overall marketplace position. Harold Copperman was blamed for the disappointment, and was duly fired by Irving Gould, the principal shareholder and ultimate authority at Commodore, at the beginning of 1991. The new company line became an exact inversion of that which had held sway at the time of the Amiga 3000’s introduction: Copperman’s expertise was business computing, but Commodore’s future lay in consumer computing. Jim Dionne, head of Commodore’s Canadian division and supposedly an expert consumer marketer, was brought in to replace him.

An old joke began to make the rounds of the company once again. A new executive arrives at his desk at Commodore and finds three envelopes in the drawer, each labelled “open in case of emergency” and numbered one, two, and three. When the company gets into trouble for the first time on his watch, he opens the first envelope. Inside is a note: “Blame your predecessor.” So he does, and that saves his bacon for a while, but then things go south again. He opens the second envelope: “Blame your vice-presidents.” So he does, and gets another lease on life, but of course it only lasts a little while. He opens the third envelope. “Prepare three envelopes…” he begins to read.

Yet anyone who happened to be looking closely might have observed that the firing of Copperman represented something more than the usual shuffling of the deck chairs on the S.S. Commodore. Upon his promotion, it was made clear to Jim Dionne that he was to be held on a much shorter leash than his predecessors, his authority carefully circumscribed. Filling the power vacuum was one Mehdi Ali, a lawyer and finance guy who had come to Commodore a couple of years before as a consultant and had since insinuated himself ever deeper into Irving Gould’s confidence. Now he advanced to the title of president of Commodore International, Gould’s right-hand man in running the global organization; indeed, he seemed to be calling far more shots these days than his globe-trotting boss, who never seemed to be around when you needed him anyway. Ali’s rise would not prove a happy event for anyone who cared about the long-term health of the company.

For now, though, the full import of the changes in Commodore’s management structure was far from clear. Amiga users were on to the next Great White Hope, one that in fact had already been hinted at in the Palladium as the Amiga 3000 was being introduced. Once more “multimedia” would be the buzzword, but this time the focus would go back to the American consumer market Commodore had repeatedly failed to capture with the Amiga 500. The clue had been there in a seemingly innocuous, almost throwaway line from the speech delivered to the Palladium crowd by C. Lloyd Mahaffrey, Commodore’s director of marketing: “While professional users comprise the majority of the multimedia-related markets today, future plans call for penetration into the consumer market as home users begin to discover the benefits of multimedia.”

Commodore’s management, (proud?) owners of the world’s first multimedia personal computer, had for most of the latter 1980s been conspicuous by their complete disinterest in their industry’s initial forays into CD-ROM, the storage medium that, along with the graphics and sound hardware the Amiga already possessed, could have been the crowning piece of the platform’s multimedia edifice. The disinterest persisted in spite of the subtle and eventually blatant hints that were being dropped by people like Cinemaware’s Bob Jacob, whose pioneering “interactive movies” were screaming to be liberated from the constraints of 880 K floppy disks.

In 1989, a tiny piece of Commodore’s small engineering staff — described as “mavericks” by at least one source — resolved to take matters into their own hands, mating an Amiga with a CD-ROM drive and preparing a few demos designed to convince their managers of the potential that was being missed. Management was indeed convinced by the demo — but convinced to go in a radically different direction from that of simply making a CD-ROM drive that could be plugged into existing Amigas.

The Dutch electronics giant Philips had been struggling for what seemed like forever to finish something they envisioned as a whole new category of consumer electronics: a set-top box for the consumption of interactive multimedia content on CD. They called it CD-I, and it was already very, very late. Originally projected for release in time for the Christmas of 1987, its constant delays had left half the entertainment-software industry, who had invested heavily in the platform, in limbo on the whole subject of CD-ROM. What if Commodore could steal Philips’s thunder by combining a CD-ROM drive with the audiovisually capable Amiga architecture not in a desktop computer but in a set-top box of their own? This could be the magic bullet they’d been looking for, the long-awaited replacement for the Commodore 64 in American living rooms.

The industry’s fixation on these CD-ROM set-top boxes — a fixation which was hardly confined to Philips and Commodore alone — perhaps requires a bit of explanation. One thing these gadgets were not, at least if you listened to the voices promoting them, was game consoles. The set-top boxes could be used for many purposes, from displaying multimedia encyclopedias to playing music CDs. And even when they were used for pure interactive entertainment, it would be, at least potentially, adult entertainment (a term that was generally not meant in the pornographic sense, although some were already muttering about the possibilities that lurked therein as well). This was part and parcel of a vision that came to dominate much of digital entertainment between about 1989 and 1994: that of a sort of grand bargain between Northern and Southern California, a melding of the new interactive technologies coming out of Silicon Valley with the movie-making machine of Hollywood. Much of television viewing, so went the argument, would become interactive, the VCR replaced with the multimedia set-top box.

In light of all this conventional wisdom, Commodore’s determination to enter the fray — effectively to finish the job that Philips couldn’t seem to — can all too easily be seen as just another example of the me-too-ism that had clung to their earlier multimedia pronouncements. At the time, though, the project was exciting enough that Commodore was able to lure quite a number of prominent names to work with them on it. Carl Sassenrath, who had designed the core of the original AmigaOS — including its revolutionary multitasking capability — signed on again to adapt his work to the needs of a set-top box. (“In many ways, it was what we had originally dreamed for the Amiga,” he would later say of the project, a telling quote indeed.) Jim Sachs, still the most famous of Amiga artists thanks to his work on Cinemaware’s Defender of the Crown, agreed to design the look of the user interface. Reichart von Wolfsheild and Leo Schwab, both well-known Amiga developers, also joined. And for the role of marketing evangelist Commodore hired none other than Nolan Bushnell, the founder almost two decades before of Atari, the very first company to place interactive entertainment in American living rooms. The project as a whole was placed in the capable hands of Gail Wellington, known throughout the Amiga community as the only Commodore manager with a dollop of sense. The gadget itself came to be called CDTV — an acronym, Commodore would later claim in a part of the sales pitch that fooled no one, for “Commodore Dynamic Total Vision.”

Nolan Bushnell, Mr. Atari himself, plugs CDTV at a trade show.

Commodore announced CDTV at the Summer Consumer Electronics Show in June of 1990, inviting selected attendees to visit a back room and witness a small black box, looking for all the world like a VCR or a stereo component, running some simple demos. From the beginning, they worked hard to disassociate the product from the Amiga and, indeed, from computers in general. The word “Amiga” appeared nowhere on the hardware or anywhere on the packaging, and if all went according to plan CDTV would be sold next to televisions and stereos in department stores, not in computer shops. Commodore pointed out that everything from refrigerators to automobiles contained microprocessors these days, but no one called those things computers. Why should CDTV be any different? It required no monitor, instead hooking up to the family television set. It neither included nor required a keyboard — much industry research had supposedly proved that non-computer users feared keyboards more than anything else — nor even a mouse, being controlled entirely through a remote control that looked pretty much like any other specimen of same one might find between the cushions of a modern sofa. “If you know how to change TV channels,” said a spokesman, “you can take full advantage of CDTV.” It would be available, Commodore claimed, before the Christmas of 1990, which should be well before CD-I despite the latter’s monumental head start.

That timeline sounded overoptimistic even when it was first announced, and few were surprised to see the launch date slip into 1991. But the extra time did allow a surprising number of developers to jump aboard the CDTV train. Commodore had never been good at developer relations, and weren’t terribly good at it now; developers complained that the tools Commodore provided were always late and inadequate and that help with technical problems wasn’t easy to come by, while financial help was predictably nonexistent. Still, lots of CD-I projects had been left in limbo by Philips’s dithering and were attractive targets for adaptation to CDTV, while the new platform’s Amiga underpinnings made it fairly simple to port over extant Amiga games like SimCity and Battle Chess. By early 1991, Commodore could point to about fifty officially announced CDTV titles, among them products from such heavy hitters as Grolier, Disney, Guinness (the publisher, not the beer company), Lucasfilm, and Sierra. This relatively long list of CDTV developers certainly seemed a good sign, even if not all of the products they proposed to create looked likely to be all that exciting, or perhaps even all that good. Plenty of platforms, including the original Amiga, had launched with much less.

While the world — or at least the Amiga world — held its collective breath waiting for CDTV’s debut, the charismatic Nolan Bushnell did what he had been hired to do: evangelize like crazy. “What we are really trying to do is make multimedia a reality, and I think we’ve done that,” he said. The hyperbole was flying thick and fast from all quarters. “This will change forever the way we communicate, learn, and entertain,” said Irving Gould. Not to be outdone, Bushnell noted that “books were great in their day, but books right now don’t cut it. They’re obsolete.” (Really, why was everyone so determined to declare the death of the book during this period?)

CDTV being introduced at the 1991 World of Amiga show. Doing the introducing is Gail Wellington, head of the CDTV project and one of the unsung heroes of Commodore.

The first finished CDTV units showed up at the World of Amiga show in New York City in April of 1991; Commodore sold their first 350 to the Amiga faithful there. A staggered roll-out followed: to five major American cities, Canada, and the Commodore stronghold of Britain in May; to France, Germany, and Italy in the summer; to the rest of the United States in time for Christmas. With CD-I now four years late, CDTV thus became the first CD-ROM-based set-top box you could actually go out and buy. Doing so would set you back just under $1000.

The Amiga community, despite being less than thrilled by the excision of all mention of their platform’s name from the product, greeted the launch with the same enthusiasm they had lavished on the Amiga 3000, their Great White Hope of the previous year, or for that matter the big Christmas marketing campaign of 1989. Amazing Computing spoke with bated breath of CDTV becoming the “standard for interactive multimedia consumer hardware.”

“Yes, but what is it for?” These prospective customers’ confusion is almost palpable.

Alas, there followed a movie we’ve already seen many times. Commodore’s marketing was ham-handed as usual, declaring CDTV “nothing short of revolutionary” but failing to describe in clear, comprehensible terms why anyone who was more interested in relaxing on the sofa than fomenting revolutions might actually want one. The determination to disassociate CDTV from the scary world of computers was so complete that the computer magazines weren’t even allowed advance models; Amiga Format, the biggest Amiga magazine in Britain at the time with a circulation of more than 160,000, could only manage to secure their preview unit by making a side deal with a CDTV developer. CDTV units were instead sent to stereo magazines, who shrugged their shoulders at this weird thing this weird computer company had sent them and returned to reviewing the latest conventional CD players. Nolan Bushnell, the alleged marketing genius who was supposed to be CDTV’s ace in the hole, talked a hyperbolic game at the trade shows but seemed otherwise disengaged, happy just to show up and give his speeches and pocket his fat paychecks. One could almost suspect — perish the thought! — that he had only taken this gig for the money.

In the face of all this, CDTV struggled mightily to make any headway at all. When CD-I hit the market just before Christmas, boasting more impressive hardware than CDTV for roughly the same price, it only made the hill that much steeper. Commodore now had a rival in a market category whose very existence consumers still obstinately refused to recognize. As an established maker of consumer electronics in good standing with the major retailers — something Commodore hadn’t been since the heyday of the Commodore 64 — Philips had lots of advantages in trying to flog their particular white elephant, not to mention an advertising budget their rival could only dream of. CD-I was soon everywhere, on store shelves and in the pages of the glossy lifestyle magazines, while CDTV was almost nowhere. Commodore did what they could, cutting the list price of CDTV to less than $800 and bundling with it The New Grolier Encyclopedia and the smash Amiga game Lemmings. It didn’t help. After an ugly Christmas season, Nolan Bushnell and the other big names all deserted the sinking ship.

Even leaving aside the difficulties inherent in trying to introduce people to an entirely new category of consumer electronics — difficulties that were only magnified by Commodore’s longstanding marketing ineptitude — CDTV had always been problematic in ways that had been all too easy for the true believers to overlook. It was clunky in comparison to CD-I, with a remote control that felt awkward to use, especially for games, and a drive which required that the discs first be placed into an external holder before being loaded into the unit proper. More fundamentally, the very re-purposing of old Amiga technology that had allowed it to beat CD-I to market made it an even more limited platform than its rival for running the sophisticated adult entertainments it was supposed to have enabled. Much of the delay in getting CD-I to market had been the product of a long struggle to find a way of doing video playback with some sort of reasonable fidelity. Even the released CD-I performed far from ideally in this area, but it did better than CDTV, which at best — at best, mind you — might be able to fill about a third of the television screen with low-resolution video running at a choppy twelve frames per second. It was going to be hard to facilitate a union of Silicon Valley and Hollywood with technology like that.

None of CDTV’s problems were the fault of the people who had created it, who had, like so many Commodore engineers before and after them, been asked to pull off a miracle on a shoestring. They had managed to create, if not quite a miracle, something that worked far better than it had a right to. It just wasn’t quite good enough to overcome the marketing issues, the competition from CD-I, and the marketplace confusion engendered by an interactive set-top box that said it wasn’t a game console but definitely wasn’t a home computer either.

CDTV could be outfitted with a number of accessories that turned it into more of a “real” computer. Still, those making software for the system couldn’t count on any of these accessories being present, which served to greatly restrict their products’ scope of possibility.

Which isn’t to say that some groundbreaking work wasn’t done by the developers who took a leap of faith on Commodore — almost always a bad bet in financial terms — and produced software for the platform. CDTV’s early software catalog was actually much more impressive than that of CD-I, whose long gestation had caused so many initially enthusiastic developers to walk away in disgust. The New Grolier Encyclopedia was a true multimedia encyclopedia; the entry for John F. Kennedy, for example, included not only a textual biography and photos to go along with it but audio excerpts from his most famous speeches. The American Heritage Dictionary also offered images where relevant, along with an audio pronunciation of every single word. American Vista: The Multimedia U.S. Atlas boasted lots of imagery of its own to add flavor to its maps, and could plan a route between any two points in the country at the click of a button. All of these things may sound ordinary today, but in a way that very modern ordinariness is a testament to what pioneering products these really were. They did in fact present an argument that, while others merely talked about the multimedia future, Commodore through CDTV was doing it — imperfectly and clunkily, yes, but one has to start somewhere.

One of the most impressive CDTV titles of all marked the return of one of the Amiga’s most beloved icons. After designing the CDTV’s menu system, the indefatigable Jim Sachs returned to the scene of his most famous creation. Really a remake rather than a sequel, Defender of the Crown II reintroduced many of the graphics and tactical complexities that had been excised from the original in the name of saving time, pairing them with a full orchestral soundtrack, digitized sound effects, and a narrator to detail the proceedings in the appropriate dulcet English accent. It was, Sachs said, “the game the original Defender of the Crown was meant to be, both in gameplay and graphics.” He did almost all of the work on this elaborate multimedia production by himself, farming out little more than the aforementioned narration, and Commodore themselves released the game, having acquired the right to do so from the now-defunct Cinemaware at auction. While, as with the original, its long-term play value is perhaps questionable, Defender of the Crown II even today still looks and sounds mouth-wateringly gorgeous.


If any one title on CDTV was impressive enough to sell the machine by itself, this ought to have been it. Unfortunately, it didn’t appear until well into 1992, by which time CDTV already had the odor of death clinging to it. The very fact that Commodore allowed the game to be billed as the sequel to one so intimately connected to the Amiga’s early days speaks to a marketing change they had instituted to try to breathe some life back into the platform.

The change was born out of an insurrection staged by Commodore’s United Kingdom branch, who always seemed to be about five steps ahead of the home office in any area you cared to name. Kelly Sumner, managing director of Commodore UK:

We weren’t involved in any of the development of CDTV technology; that was all done in America. We were taking the lead from the corporate company. And there was a concrete stance of “this is how you promote it, this is the way forward, don’t do this, don’t do that.” So, that’s what we did.

But after six or eight months we basically turned around and said, “You don’t know what you’re talking about. It ain’t going to go anywhere, and if it does go anywhere you’re going to have to spend so much money that it isn’t worth doing. So, we’re going to call it the Amiga CDTV, we’re going to produce a package with disk drives and such like, and we’re going to promote it like that. People can understand that, and you don’t have to spend so much money.”

True to their word, Commodore UK put together what they called “The Multimedia Home Computer Pack,” combining a CDTV unit with a keyboard, a mouse, an external disk drive, and the software necessary to use it as a conventional Amiga as well as a multimedia appliance — all for just £100 more than a CDTV unit alone. Commodore’s American operation grudgingly followed their lead, allowing the word “Amiga” to creep back into their presentations and advertising copy.

Very late in the day, Commodore finally began acknowledging and even celebrating CDTV’s Amigahood.

But it was too late — and not only for CDTV but in another sense for the Amiga platform itself. The great hidden cost of the CDTV disappointment was the damage it did to the prospects for CD-ROM on the Amiga proper. Commodore had been so determined to position CDTV as its own thing that they had rejected the possibility of equipping Amiga computers as well with CD-ROM drives, despite the pleas of software developers and everyday customers alike. A CD-ROM drive wasn’t officially mated to the world’s first multimedia personal computer until the fall of 1992, when, with CDTV now all but left for dead, Commodore finally started shipping an external drive that made it possible to run most CDTV software, as well as CD-based software designed specifically for Amiga computers, on an Amiga 500. Even then, Commodore provided no official CD-ROM solution for Amiga 2000 and 3000 owners, forcing them to cobble together third-party adapters that could interface with drives designed for the Macintosh. The people who owned the high-end Amiga models, of course, were the ones working in the very cutting-edge fields that cried out for CD-ROM.

It’s difficult to overstate the amount of damage the Amiga’s absence from the CD-ROM party, the hottest ticket in computing at the time, did to the platform’s prospects. It single-handedly gave the lie to every word in Harold Copperman’s 1990 speech about Commodore being “the leaders in multimedia.” Many of the most vibrant Amiga developers were forced to shift to the Macintosh or another platform by the lack of CD-ROM support. Of all Commodore’s failures, this one must loom among the largest. They allowed the Macintosh to become the platform most associated with the new era of CD-ROM-enabled multimedia computing without even bothering to contest the territory. The war was over before Commodore even realized a war was on.

Commodore’s feeble last gasp in terms of marketing CDTV positioned it as essentially an accessory to desktop Amigas, a “low-cost delivery system for multimedia” targeted at business and government rather than living rooms. The idea was that you could create presentations on Amiga computers, send them off to be mastered onto CD, then drag the CDTV along to board meetings or planning councils to show them off. In that spirit, a CDTV unit was reduced to a free toss-in if you bought an Amiga 3000 — two slow-selling products that deserved one another.

The final verdict on CDTV is about as ugly as they come: fewer than 30,000 units sold worldwide in some eighteen months of trying; fewer than 10,000 sold in the American market Commodore so desperately wanted to break back into, and many or most of those sold at fire-sale discounts after the platform’s fate was clear. In other words, the 350 CDTV units that had been sold to the faithful at that first ebullient World of Amiga show made up an alarmingly high percentage of all the CDTV units that would ever sell. (Philips, by contrast, would eventually manage to move about 1 million CD-I units over the course of about seven years of trying.)

The picture I’ve painted of the state of Commodore thus far is a fairly bleak one. Yet that bleakness wasn’t really reflected in the company’s bottom line during the first couple of years of the 1990s. For all the trouble Commodore had breaking new products in North America and elsewhere, their legacy products were still a force to be reckoned with outside the United States. Here the end of the Cold War and subsequent lifting of the Iron Curtain proved a boon. The newly liberated peoples of Eastern Europe were eager to get their hands on Western computers and computer games, but had little money to spend on them. The venerable old Commodore 64, pulling along behind it that rich catalog of thousands upon thousands of games of all stripes, was the perfect machine for these emerging markets. Effectively dead in North America and trending that way in Western Europe, it now enjoyed a new lease on life in the former Soviet sphere, its sales numbers suddenly climbing sharply again instead of falling. The Commodore 64 was, it seemed, the cockroach of computers; you just couldn’t kill it. Not that Commodore wanted to: they would happily bank every dollar their most famous creation could still earn them. Meanwhile the Amiga 500 was selling better than ever in Western Europe, where it was now the most popular single gaming platform of all, and Commodore happily banked those profits as well.

Commodore’s stock even enjoyed a brief-lived bubble of sorts. In the spring and early summer of 1991, with sales strong all over Europe and CDTV poised to hit the scene, the stock price soared past $20, stratospheric heights by Commodore’s recent standards. This being Commodore, the stock collapsed below $10 again just as quickly — but, hey, it was nice while it lasted. In the fiscal year ending on June 30, 1991, worldwide sales topped the magical $1 billion mark, another height that had last been seen in the heyday of the Commodore 64. Commodore was now the second most popular maker of personal computers in Europe, with a market share of 12.4 percent, just slightly behind IBM’s 12.7 percent. The Amiga was now selling at a clip of 1 million machines per year, which would bring the total installed base to 4.5 million by the end of 1992. Of that total, 3.5 million were in Europe: 1.3 million in Germany, 1.2 million in Britain, 600,000 in Italy, 250,000 in France, 80,000 in Scandinavia. (Ironically in light of the machine’s Spanish name, one of the few places in Western Europe where it never did well at all was Spain.) To celebrate their European success, Irving Gould and Mehdi Ali took home salaries in 1991 of $1.75 million and $2.4 million respectively, the latter figure $400,000 more than the chairman of IBM, a company fifty times Commodore’s size, was earning.

But it wasn’t hard to see that Commodore, in relying on all of these legacy products sold in foreign markets, was living on borrowed time. Even in Europe, MS-DOS was beginning to slowly creep up on the Amiga as a gaming platform by 1992, while Nintendo and Sega, the two big Japanese console makers, were finally starting to take notice of this virgin territory after having ignored it for so long. While Amiga sales in Europe in 1992 remained blessedly steady, sales of the Amiga in North America were down as usual, sales of the Commodore 64 in Eastern Europe fell off thanks to economic chaos in the region, and sales of Commodore’s line of commodity PC clones cratered so badly that they pulled out of that market entirely. It all added up to total sales of about $900 million for the fiscal year ending on June 30, 1992. The company was still profitable, but considerably less so than it had been the year before. Everyone was now looking forward to 1993 with more than a little trepidation.

Even as Commodore faced an uncertain future, they could at least take comfort that their arch-enemy Atari was having a much worse time of it. In the very early 1990s, Atari enjoyed some success, if not as much as they had hoped, with their Lynx handheld game console, a more upscale rival to the Nintendo Game Boy. The Atari Portfolio, a genuinely groundbreaking palmtop computer, also did fairly well for them, if perhaps not quite as well as it deserved. But the story of their flagship computing platform, the Atari ST, was less happy. Already all but dead in the United States, the ST saw its European market share shrink in proportion to the Amiga’s increasing sales, falling from second to third most popular gaming computer in 1991, trailing MS-DOS now as well as the Amiga.

Atari tried to remedy the slowing sales with new machines they called the STe line, which increased the color palette to 4096 shades and added a blitter chip to aid onscreen animation. (The delighted Amiga zealots at Amazing Computing wrote of these Amiga-inspired developments that they reminded them of “an Amiga 500 created by a primitive tribe that had never actually seen an Amiga, but had heard reports from missionaries of what the Amiga could do.”) But the new hardware broke compatibility with much existing software, and it only got harder to justify buying an STe instead of an Amiga 500 as the latter’s price slowly fell. Atari’s total sales in 1991 were just $285 million, down by some 30 percent from the previous year and barely a quarter of the numbers Commodore was doing. Jack Tramiel and his sons kept their heads above water only by selling off pieces of the company, such as the Taiwanese manufacturing facility that went for $40.9 million that year. You didn’t have to be an expert in the computer business to understand how unsustainable that path was. In the second quarter of 1992, Atari posted a loss of $39.8 million on sales of just $23.3 million, a rather remarkable feat in itself. Whatever else lay in store for Commodore and the Amiga, they had apparently buried old Mr. “Business is War.”

Still, this was no time to bask in the glow of sweet revenge. The question of where Commodore and the Amiga went from here was being asked with increasing urgency in 1992, and for very good reason. The answer would arrive in the latter half of the year, in the form at long last of the real, fundamental technical improvements for which the Amiga community had been begging for so long. But had Commodore done enough, and had they done it in time to make a difference? Those questions loomed large as the 68000 Wars were about to enter their final phase.

(Sources: the book On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall; Amazing Computing of August 1987, June 1988, June 1989, July 1989, May 1990, June 1990, July 1990, August 1990, September 1990, December 1990, January 1991, February 1991, March 1991, April 1991, May 1991, June 1991, August 1991, September 1991, November 1991, January 1992, February 1992, March 1992, April 1992, June 1992, July 1992, August 1992, September 1992, November 1992, and December 1992; Info of July/August 1988 and January/February 1989; Amiga Format of July 1991, July 1995, and the 1992 annual; The One of September 1990, May 1991, and December 1991; CU Amiga of June 1992, October 1992, and November 1992; Amiga Computing of April 1992; AmigaWorld of June 1991. Online sources include Matt Barton’s YouTube interview with Jim Sachs, Sébastien Jeudy’s interview with Carl Sassenrath, Greg Donner’s Workbench Nostalgia, and Atari’s annual reports from 1989, available on archive.org. My huge thanks to reader “himitsu” for pointing me to the last and providing some other useful information on Commodore and Atari’s financials during this period in the comments to a previous article in this series. And thank you to Reichart von Wolfsheild, who took time from his busy schedule to spend a Saturday morning with me looking back on the CDTV project.)

 
 


Living Worlds of Action and Adventure, Part 1: The Atari Adventure

As regular readers of this blog are doubtless well aware, we stand now at the cusp not only of a new decade but also of a new era in terms of this history’s internal chronology. The fractious 1980s, marked by a bewildering number of viable computing platforms and an accompanying anything-goes creative spirit in the games that were made for them, are becoming the Microsoft-dominated 1990s, with budgets climbing and genres hardening (these last two things are not unrelated to one another). CD-ROM, the most disruptive technology in gaming since the invention of the microprocessor, hasn’t arrived as quickly as many expected it would, but it nevertheless looms there on the horizon as developers and publishers scramble to prepare themselves for the impact it must have. For us virtual time travelers, then, there’s a lot to look forward to. It should be as exciting a time to write about — and hopefully to read about — as it was to live through.

Yet a period of transition like this also tempts a writer to look backward, to think about the era that is passing in terms of what was missed and who was shortchanged. It’s at a time like this that all my vague promises to myself to get to this story or that at some point come home to roost. For if not now, when? In that light, I hope you’ll forgive me for forcing you to take one or two more wistful glances back with me before we stride boldly forward into our future of the past. There’s at least one more aspect of 1980s gaming, you see, that I really do feel I’d be remiss not to cover in much better detail than I have to this point: the astonishing, and largely British, legacy of the open-world action-adventure.

First, a little taxonomy to make sure we’re all on the same page. The games I want to write about take the themes and mechanics of adventure games — meaning text adventures during most of the era in question — and combine them with the graphics and input methods of action games; thus the name of “action-adventure.” Still, neither the name nor the definition conveys what audacious achievements the best of these games could be. On computers which often still had to rely on cassettes rather than disks for storage, and which struggled to run even Infocom-level text adventures, programmers who were almost universally young, and who generally worked alone or in pairs, proposed to create huge virtual worlds to explore — worlds which were to be depicted not in text but visually, running in organic, fluid real time. It was, needless to say, a staggeringly tall order. The programmers who tackled it did so because, being so young, they simply didn’t know any better. What’s remarkable is the extent to which they succeeded in their goals.

Which is not to say that games in this category have aged as well as, say, the majority of the Infocom catalog. Indeed, herein lies much of the reason that I’ve rather neglected these games to date. As all you regulars know by now, I place a premium on fairness and solubility in adventure-game design. I find it hard to recommend or overly praise games which lack this fundamental good faith toward their players, even if they’re wildly innovative or interesting in other ways.

That said, though, we shouldn’t entirely forget the less-playable games of history which pushed the envelope in important ways. And among the very most interesting of such interesting failures are many games of the early action-adventure tradition.

It’s not hard to pinpoint the reasons that these games ended up as they are. Their smoothly-scrolling and/or perspective-bending worlds make them hard to map, and thus hard for the player to methodically explore, in contrast to the grid-based movement of text adventures or early CRPGs. The dynamism of their worlds, in contrast to those of other genres, leaves them subject to all sorts of potentially game-wrecking emergent situations. Their young creators had no grounding in game design, and were in fact usually far more interested in the world they were creating inside their primitive instruments than they were in the game they were asking their players to solve there. And in this era neither developers nor publishers had much of an inkling about the concept of testing. We should perhaps be more surprised that as many games of this stripe ended up as playable as they are than the reverse.

As I’ve admitted before, it’s inevitably anachronistic to return to these ancient artifacts today. In their own day, players were so awe-struck by these worlds’ very existence that they weren’t usually overly fixated on ending all the fun of exploring them with a victory screen. So, I’m going to relax my usual persnicketiness on the subject of fairness just a bit in favor of honoring what these games did manage to achieve. On this trip back through time, at least, let’s try to see what players saw back in the day and not quibble too much over the rest.

In this first article, we’ll go to the United States to look at the development of the very first action-adventure. It feels appropriate for such a beast to have started life as a literal translation of the original adventure game — Will Crowther and Don Woods’s Adventure — into a form manageable on the Atari VCS game console. The consoles aren’t my primary focus for this history, but this particular console game was just so important for future games on computers that my neglect of it has been bothering me for years.

After this article, we’ll turn the focus to Britain, the land where the challenge laid down by the Atari VCS Adventure was picked up in earnest, to look at some of the more remarkable feats of virtual world-building of the 1980s. And after that, and after looking back at one more subject that’s been sticking in his craw — more on that when the time comes — your humble writer here can start to look forward again from his current perch in the historical timeline with a clearer conscience.

A final note: I am aware that games of this type have a grand tradition of their own in Japan, which arrived on American shores through Nintendo Entertainment System titles like 1986’s The Legend of Zelda. I hope fans of such games will forgive me for neglecting them. My arguments for doing so are the usual suspects: that writing about them really would be starting to roam dangerously far afield from this blog’s core focus on computer gaming, that my own knowledge of them is limited to say the least, and that it’s not hard to find in-depth coverage of them elsewhere.

 

Adventure (1980)


 

Like so many others, Warren Robinett had his life changed by Will Crowther and Don Woods’s game of Adventure. In June of 1978, he was 26 years old and was working for Atari as a programmer of games for their VCS console, which was modestly successful at the time but still over a year removed from the massive popularity that would follow. More due to the novelty of the medium and upper management’s disinterest in the process of making games than any spirit of creative idealism, Atari at the time operated on the auteur model of videogame development. Programmers like Robinett were expected not only to fill the role of designers — a role that had yet to be clearly defined anywhere as distinct from programming — but also to function as their own artists and writers. To go along with this complete responsibility, they were given complete control of every aspect of their games, including the opportunity to decide what sorts of games to make in the first place. On Robinett’s first day of work, according to his own account, his new boss Larry Kaplan had told him, “Your job is to design games. Now go design one.” The first fruit of his labor had been a game called Slot Racers, a simple two-player exercise in maze-running and shooting that was very derivative of Combat, the cartridge that was bundled with every Atari VCS sold.

With Slot Racers under his belt, Robinett was expected, naturally, to come up with a new idea for his next game. Luckily, he already knew what he wanted to do. His roommate happened to work at the storied Stanford Artificial Intelligence Lab, and one day had invited him to drop by after hours to play a neat game called Adventure on the big time-shared computers that lived there. Robinett declared it to be “the coolest thing I’ve ever seen.” He decided that very night that he wanted to make his next project an adaptation of Adventure for the Atari VCS.

On the face of it, the proposition made little sense. My description of Robinett’s Slot Racers as derivative of Combat begins to sound like less of a condemnation if one considers that no one had ever anticipated the Atari VCS being used to run games that weren’t built, as Combat and Slot Racers had been, from the simple-minded raw material of the earliest days of the video arcades. The machine’s designers had never, in other words, intended it to go much beyond Pong and Breakout. Certainly the likes of Adventure had never crossed their minds.

Adventure consisted only of text, which the VCS wasn’t terribly adept at displaying, and its parser accepted typed commands from a keyboard, which the VCS didn’t possess; the latter’s input mechanism was limited to a joystick with a single fire button. The program code and data for Adventure took more than 100 K of storage space on the big DEC PDP-10 computer on which it ran. The Atari VCS, on the other hand, used cartridge-housed ROM chips capable of storing a program of a maximum of 4 K of code and data, and boasted just 128 bytes — yes, bytes — of memory for the volatile storage of in-game state. In contrast to a machine like the PDP-10 — or for that matter to just about any other extant machine — the VCS was shockingly primitive to program. There not being space enough to store the state of the screen in those 128 bytes, the programmer had to manually control the electron beam which swept left to right and top to bottom sixty times per second behind the television screen, telling it where it should spray its blotches of primary colors. Every other function of a game’s code had to be subsidiary to this one, to be carried out during those instants when the beam was making its way back to the left side of the screen to start a new line, or — the most precious period of all — as it moved from the end of one round of painting at the bottom right of the screen back to the top left to start another.
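To make the shape of that scheduling problem a little more concrete, here is a minimal conceptual sketch in Python. It is emphatically not how a real VCS cartridge was programmed (that was hand-tuned 6502 assembly driving the machine’s video hardware cycle by cycle); it only illustrates the structure described above, with no frame buffer to fall back on and with everything besides drawing squeezed into the blanking periods.

```python
# Conceptual sketch only: a toy model of the VCS's "racing the beam" schedule,
# not actual hardware code. Roughly 192 visible scanlines per frame, sixty
# frames per second, and no frame buffer anywhere.

VISIBLE_SCANLINES = 192

def run_frame(game_state, update_game, draw_scanline):
    # Vertical blank: the main breathing room for "real" game logic, such as
    # moving creatures, checking collisions, and reading the joystick.
    update_game(game_state)

    # Visible picture: the program races the beam, deciding what each scanline
    # should contain just before the beam paints it. Whatever isn't computed
    # in time simply doesn't appear on screen.
    for line in range(VISIBLE_SCANLINES):
        draw_scanline(game_state, line)

    # Overscan at the bottom of the frame: a little more time for housekeeping
    # before the whole cycle starts over again.
```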

The creative freedom that normally held sway at Atari notwithstanding, Robinett’s bosses were understandably resistant to what they viewed as his quixotic quest. Undeterred, he worked on it for the first month in secret, hoping to prove to himself as much as anyone that it could be done.

It was appropriate in a way that it should have been Warren Robinett among all the young programmers at Atari who decided to bring to the humble VCS such an icon of 1970s institutional computing —  a rarefied environment far removed from the populist videogames, played in bars and living rooms, that were Atari’s bread and butter. Almost all of the programmers around him were self-taught hackers, masters of improvisation whose code would have made any computer-science professor gasp in horror but whose instincts were well-suited to get the most out of the primitive hardware at their disposal. Robinett’s background, however, was very different. He brought with him to Atari a Bachelor’s Degree in computer science from Rice University and a Master’s from the University of California, Berkeley, and along with them a grounding in the structure and theory of programming which his peers lacked. He was, in short, the perfect person at Atari to be having a go at this project. While he would never be one of the leading lights of the programming staff in terms of maximizing the VCS’s audiovisual capabilities, he knew how to design the data structures that would be necessary to make a virtual world come to life in 4 K of ROM and 128 bytes of RAM.

Robinett’s challenge, then, was to translate the conventions of the text adventure into a form that the VCS could manage. This naturally entailed turning Crowther and Woods’s text into graphics — and therein lies an amusing irony. In later years, after they were superseded by various forms of graphic adventures, text adventures would come to be seen by many not so much as a legitimate medium in themselves as a stopgap, an interim way to represent a virtual world on a machine that didn’t have the capability to display proper graphics. Yet Robinett came to this, the very first graphic adventure, from the opposite point of view. What he really wanted to do was to port Crowther and Woods’s Adventure in all its textual glory to the Atari VCS. But, since the VCS couldn’t display all that text, he’d have to find a way to make do with crude old graphics.

The original Adventure, like all of the text adventures that would follow, built its geography as a topology of discrete “rooms” that the player navigated by typing in compass directions. In his VCS game, Robinett represented each room as a single screen. Instead of typing compass directions, you move from room to room simply by guiding your avatar off the side of a screen using the joystick: north becomes the upper boundary of the screen, east the right-hand boundary, etc. Robinett thus created the first VCS game to have any concept of a geography that spanned beyond what was visible on the screen at any one time. The illustration below shows the text-adventure-like map he crafted for his world.

It’s important to note, though, that even such a seemingly literal translation of a text adventure’s geography to a graphical game brought with it implications that may not be immediately obvious. Most notably, your avatar can move about within the rooms of Robinett’s game, a level of granularity that its inspiration lacks; in Crowther and Woods’s Adventure, you can be “in” a “room” like “End of Road” or “Inside Building,” but the simulation of space extends no further. The effect these differences have on the respective experiences can be seen most obviously in the two games’ approaches to mazes. Crowther and Woods’s (in)famous “maze of twisty little passages” is built out of many individual rooms; the challenge comes in charting the one-way interconnections between them all. In Robinett’s game, however, the mazes — there are no less than four of them for the same reason that mazes were so common in early text adventures: they’re cheap and easy to implement — are housed within the rooms, even as they span multiple rooms when taken in their entirety.

Crowther and Woods’s maze of twisty little passages, a network of confusing room interconnections where going north and then going south usually won’t take you back to where you started.

Warren Robinett’s graphical take on the adventure-game maze; it must be navigated within the rooms as well as among them. The dot at left is the player’s avatar, which at the moment is carrying the Enchanted Chalice whose recovery is the goal of the game.
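For readers who like to see the mechanics spelled out, here is a small illustrative sketch, in Python, of the room-per-screen navigation described above. The playfield dimensions, room names, and wiring between rooms are all invented for the example; only the basic idea comes from the game: walking off one edge of the screen puts your avatar at the opposite edge of the adjacent room.

```python
# Illustrative sketch of "one room per screen" navigation. The dimensions and
# the sample rooms are invented; Robinett's game had about 30 rooms in all.

SCREEN_W, SCREEN_H = 160, 96  # arbitrary playfield size for this sketch

class Room:
    def __init__(self, name):
        self.name = name
        self.neighbors = {}  # direction ("north", "south", ...) -> Room

def move_avatar(room, x, y, dx, dy):
    """Move within the current room; crossing an edge enters the neighbor."""
    x, y = x + dx, y + dy
    if x < 0 and "west" in room.neighbors:
        return room.neighbors["west"], SCREEN_W - 1, y
    if x >= SCREEN_W and "east" in room.neighbors:
        return room.neighbors["east"], 0, y
    if y < 0 and "north" in room.neighbors:
        return room.neighbors["north"], x, SCREEN_H - 1
    if y >= SCREEN_H and "south" in room.neighbors:
        return room.neighbors["south"], x, 0
    # No exit in that direction: stay inside the current room. (The real game
    # also has walls within rooms, which this sketch ignores.)
    return room, min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1)

# Wiring two hypothetical rooms together, north to south:
room_a, room_b = Room("Room A"), Room("Room B")
room_a.neighbors["south"] = room_b
room_b.neighbors["north"] = room_a
```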

Beyond the challenges of mapping its geography, much of Crowther and Woods’s game revolves around solving a series of set-piece puzzles, usually by using a variety of objects found scattered about within the various rooms; you can pick up such things as keys and lanterns and carry them about in your character’s “inventory” to use elsewhere. Robinett, of course, had to depict such objects graphically. To pick up an object in his game, you need simply bump into it with your avatar; to drop it you push the fire button. Robinett considered trying to implement a graphical inventory screen for his game, but in the end chose to wave any such tricky-to-implement beast away by only allowing your avatar to carry one item at a time. Similarly, the “puzzles” he placed in the game, such as they are, are all simple enough that they can be solved merely by bringing an appropriate object into the vicinity of the problem. Opening a locked gate, for instance, requires only that the player walk up to it toting the appropriate key; ditto attacking a dragon with a sword.  By these means, Robinett pared down the “verbs” in his game to the equivalent of the text parser’s movement commands, its “take” and “drop” commands, and a sort of generic, automatically-triggered “use” action that took the place of all the rest of them. (Interestingly, the point-and-click, non-action-oriented graphical adventures that would eventually replace text adventures on the market would go through a similar process of simplification, albeit over a much longer stretch of time, so that by the end of the 1990s most of them too would offer no more verbs than these.)
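Here is an equally hypothetical sketch of that pared-down verb set, again in Python. The data structures and the gate-and-key example are made up for illustration; the point is simply how little explicit input the scheme requires: bump into something to take it, press the one button to drop it, and let proximity stand in for every other verb.

```python
# Sketch of the reduced verb set: take by touching, drop with the fire button,
# and an implicit "use" triggered by being in the right place with the right
# object. Names and fields here are invented for the example.

def touch_object(avatar, obj):
    """Bumping into an object picks it up; only one item can be carried.
    (What the real game does when you bump into a second object while already
    carrying one is glossed over in this sketch.)"""
    if avatar.get("carrying") is None:
        avatar["carrying"] = obj

def press_button(avatar):
    """The single fire button drops whatever the avatar is carrying."""
    avatar["carrying"] = None

def try_open_gate(avatar, gate):
    """An implicit 'use': walking up to the gate with its key is enough."""
    if avatar.get("carrying") == gate["key"]:
        gate["open"] = True
    return gate["open"]
```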

The text-to-graphics adaptations we’ve seen so far would, with the exception only of the mazes, seem to make of Robinett’s Adventure a compromised shadow of its inspiration, lacking not only its complexity of play but also, thanks to the conversion of Crowther and Woods’s comparatively refined prose to the crudest of graphics, its flavor as well. Yet different mediums do different sorts of interactivity well. Robinett managed, despite the extreme limitations of his hardware, to improve on his inspiration in certain ways, to make some aspects of his game more complicated and engaging in compensation for its simplifications. Other than the mazes, the most notable case is that of the other creatures in the world.

The world of the original Adventure isn’t an entirely uninhabited place — it includes a dwarf and a pirate who move about the map semi-randomly — but these other actors play more the role of transitory annoyances than that of core elements of the game. With the change in medium, Robinett could make his other creatures play a much more central role. Using an approach he remains very proud of to this day, he gave his four creatures — three dragons and a pesky, object-stealing bat, an analogue to Crowther and Woods’s kleptomaniacal pirate — “fears” and “desires” to guide their movements about the world. With the addition of this basic artificial intelligence, his became a truly living world sporting much emergent possibility: the other creatures continue moving autonomously through it, pursuing their own agendas, whether you’re aware of them or not. When you do find yourself in the same room/screen as one of the dragons, you had best run away if you don’t have the sword. If you do, the hunted can become the hunter: you can attempt to kill your stalker. These combat sequences, like all of the game, run in real time, another marked contrast with the more static world of the Crowther and Woods Adventure. Almost in spite of Robinett’s best intentions, the Atari VCS’s ethos of action-based play thus crept into his staid adventure game.
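Robinett has described the scheme in terms of simple attractions and repulsions, and a minimal sketch of that idea, under assumptions of my own, might look like the Python below. The priority ordering (fear overrides desire) and the particular contents of each creature’s lists are illustrative choices, not a transcription of the game’s actual tables.

```python
# Minimal sketch of "fears and desires" steering. Every tick, each creature
# moves away from the nearest thing it fears, or failing that toward the
# nearest thing it desires, whether or not the player is there to see it.
# The fear-beats-desire ordering is an assumption made for this example.

def nearest(creature, world, kinds):
    """The closest object whose kind is in the given collection, or None."""
    candidates = [o for o in world["objects"] if o["kind"] in kinds]
    if not candidates:
        return None
    cx, cy = creature["pos"]
    return min(candidates,
               key=lambda o: abs(o["pos"][0] - cx) + abs(o["pos"][1] - cy))

def step_toward(pos, target, away=False):
    """One-unit step toward (or away from) a target position."""
    sign = -1 if away else 1
    dx = sign if target[0] > pos[0] else -sign if target[0] < pos[0] else 0
    dy = sign if target[1] > pos[1] else -sign if target[1] < pos[1] else 0
    return pos[0] + dx, pos[1] + dy

def update_creature(creature, world):
    """One tick of autonomous behavior for a dragon, bat, or similar."""
    feared = nearest(creature, world, creature["fears"])
    if feared is not None:
        creature["pos"] = step_toward(creature["pos"], feared["pos"], away=True)
        return
    desired = nearest(creature, world, creature["desires"])
    if desired is not None:
        creature["pos"] = step_toward(creature["pos"], desired["pos"])
```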

Robinett’s Adventure was becoming a game with a personality of its own rather than a crude re-implementation of a text game in graphics. It was becoming, in other words, a game that maximized the strengths of its medium and minimized its weaknesses. Along the way, it only continued to move further from its source material. Robinett had originally planned so literal a translation of Crowther and Woods’s Adventure that he had sought ways to implement its individual puzzles; he remembers struggling for some time to recreate his inspiration’s “black rod with a rusty star on the end,” which when waved in the right place creates a bridge over an otherwise impassable chasm. In the end, he opted instead to create a movable bridge object which you can pick up and carry around, dropping it on walls to create passages. Robinett:

Direct transliterations from text to video format didn’t work out very well. While the general idea of a videogame with rooms and objects seemed to be a good one, the graphic language of the videogame and the verbal language of the text dialogue turned out to have significantly different strengths. Just as differences between filmed and live performance caused the art form of cinema to slowly diverge from its parent, drama, differences between the medium of animated graphics and the medium of text have caused the animated adventure game to diverge from the text-adventure game.

So, Robinett increasingly turned away from direct translation in favor of thematic analogues to the experience of playing Adventure in text form. To express the text-based game’s obsession with lighted and dark rooms and the lantern that turns the latter into the former, for instance, he included a “catacombs” maze where only the few inches immediately surrounding your avatar can be seen.

But even more radical departures from his inspiration were very nearly forced upon him. When Robinett showed his work-in-progress, heretofore a secret, to his management at Atari, they liked his innovations, but thought they could best be applied to a game based on the upcoming Superman movie, for which Atari had acquired a license. Yet Robinett remained wedded to his plans for a game of fantasy adventure, creating no small tension. Finally another programmer, John Dunn, agreed to adapt the code Robinett had already written to the purpose of the Superman game while Robinett himself continued to work on Adventure. Dunn’s game, nowhere near as complex or ambitious as Robinett’s but nevertheless clearly sporting a shared lineage with it, hit the market well before Adventure, thereby becoming the first released Atari VCS game with a multi-screen geography. Undaunted, Robinett soldiered on to finish creating the new genre of the action-adventure.

The Atari Adventure‘s modest collection of creatures and objects. The “cursor” represents the player’s avatar. Its shape was actually hard-coded into the Atari VCS, where it was intended to represent the ball in a Pong-like game — a telling sign of the only sorts of games the machine’s creators had envisioned being run on it. And if you think the dragons look like ducks, you’re not alone. Robinett never claimed to be an artist…

Ambitious though it was in comparison to Superman, his graphical adventure game of 4 K was inevitably constrained in contrast to a text-based game of more than 100 K. He wound up with a world of about 30 rooms — as opposed to the 130 rooms of his inspiration — housing a slate of seven totable objects: three keys, each opening a different gate; a sword for fighting the dragons; the bridge; a magnet that attracts to it other objects that may be inaccessible directly; and the Enchanted Chalice that you must find and return to the castle where you begin the game in order to complete it. (Rather than the fifteen treasures of Crowther and Woods’s Adventure, Robinett’s game has just this one.)

The constraints of the Atari VCS ironically allowed Robinett to avoid the pitfall that dogs so many later games of this ilk: a tendency to sprawl out into incoherence. Adventure, despite or perhaps because of its primitiveness, is playable and soluble, and can be surprisingly entertaining even today. To compensate for his constrained world, and to cater to the youngsters who formed the core of Atari’s customers, Robinett designed the game with three selectable modes of play: a simplified version for beginners and/or the very young, a full version, and a version that scattered all of the objects and creatures randomly about the world to create a new challenge every time. This last mode was obviously best-suited for players who had beaten the game’s other modes, for whom it lent the $25 cartridge a welcome modicum of replayability.

Thanks to the efforts of Jason Scott and archive.org, Adventure can be played today in a browser. Failing that, the video below, prepared by Warren Robinett for his postmortem of the Atari Adventure at the 2015 Game Developers Conference, shows a speed run through the simplified version of the game — enough to demonstrate most of its major elements.


Robinett finished his Adventure in early 1979, about two years after Crowther and Woods’s game had first taken institutional computing by storm and about eight months after he’d begun working on his videogame take on their concept. (True to his role of institutional computing’s ambassador to the arcade, he’d spent most of that time working concurrently on what seemed an even more impossible task: a BASIC programming system for the Atari VCS, combining a cartridge with a pair of hardware controllers that together formed an awkward keyboard.) From the beginning right up to the date of its release, his game’s name remained simply Adventure, nobody apparently ever having given any thought to the confusion this could create among those familiar with Crowther and Woods’s game. Frustrated by Atari’s policy of giving no public credit to the programmers who created their games, one of Robinett’s last additions was a hidden Easter egg, one of videogaming’s first. It took the form of a secret room housing the only text in this game inspired by a text adventure, spelling out the message “Created by Warren Robinett.” Unhappy with his fixed salary of about $22,000 per year, Robinett left Atari shortly thereafter, going on to co-found The Learning Company, a pioneer in educational software. The first title he created there, Rocky’s Boots, built on many of the techniques he’d developed for Adventure, although it ran on an Apple II computer rather than the Atari VCS.

In the wake of Robinett’s departure, Atari’s marketing department remained nonplussed by this unusually complex and cerebral videogame he had foisted on them. Preoccupied by Atari’s big new game for the Christmas of 1979, a port of the arcade sensation Asteroids, they didn’t even release it until June of 1980, more than a year after Robinett had finished it. Yet the late release date proved to be propitious, coming as it did just after the Atari VCS’s first huge Christmas season, when demand for games was exploding and the catalog of available games was still fairly small. Adventure became something of a sleeper hit, selling first by random chance, plucked by nervous parents off of patchily-stocked store shelves, and then by word of mouth as its first players recognized what a unique experience it really was. Robinett claims it wound up selling 1 million copies, giving the vast majority of that million their very first taste of a computer-based adventure game.

For that reason, the game’s importance for our purposes extends far beyond that of being just an interesting case study in converting from one medium to another. A long time ago, when this blog was a much more casual affair than it’s since become, I wrote these words about Crowther and Woods’s Adventure:

It has long and rightfully been canonized as the urtext not just of textual interactive fiction but of a whole swathe of modern mainstream videogames. (For example, trace World of Warcraft‘s lineage back through Ultima Online and Richard Bartle’s original MUD and you arrive at Adventure.)

I’m afraid I rather let something fall by the wayside there. Robinett’s Adventure, the first of countless attempts to apply the revolutionary ideas behind Crowther and Woods’s game to the more mass-market-friendly medium of graphics, is in fact every bit as important to the progression outlined above as is MUD.[1]

That said, my next couple of articles will be devoted to charting the game’s more immediate legacy: the action-adventures of the 1980s, which would borrow heavily from its conventions and approaches. While the British programmers we’ll be turning to next had at their disposal machines exponentially more powerful than Robinett’s Atari VCS, they expanded their ambitions exponentially to match. Whether considered as technical or imaginative feats, or both, the action-adventures to come would be among the most awe-inspiring virtual worlds of their era. If you grew up with these games, you may be nodding along in agreement right now. If you didn’t, you may be astonished at how far their young programmers reached, and how far some of them managed to grasp despite all the issues that should have stopped them in their tracks. But then, in this respect too they were only building on the tradition of Warren Robinett’s Adventure.

(Sources: the book Racing the Beam: The Atari Video Computer System by Nick Montfort and Ian Bogost; Warren Robinett’s chapter “Adventure as a Video Game: Adventure for the Atari 2600” from The Game Design Reader: A Rules of Play Anthology; Next Generation of January 1998; Warren Robinett’s Adventure postmortem from the 2015 Game Developers Conference; Robinett’s interview from Halcyon Days. As noted in the article proper, you can play Robinett’s Adventure in your browser at archive.org.)

Footnotes

1 As for MUD: don’t worry, I have plans to round it up soon as well.
 
 


A Tale of the Mirror World, Part 8: Life After Tetris

Alexey Pajitnov and Henk Rogers meet with Tetris licensees EA Mobile in 2015.

As the dust settled from the battle over Tetris and a river of money started flowing back to the Soviet Union, the Soviet Academy of Sciences made a generous offer to Alexey Pajitnov. In acknowledgment of his service to the state, they told him, they would buy him an IBM PC/AT — an obsolete computer in Western terms but one much better than the equipment Pajitnov was used to. This would be the only tangible remuneration he would ever receive from them for making the most popular videogame in the history of the world.

But the Soviet Union was changing rapidly, and Pajitnov intended to change with it. He formed a little for-profit company — such a thing was now allowed in the Soviet Union — called Dialog with a number of his old friends and colleagues from the Moscow Computer Center, among them his friendly competitor in game-making, Dmitry Pavlovsky, and the first of many psychologists who would be fascinated by the Tetris Effect, Vladimir Pokhilko. The name of the company was largely a reflection of a pet project cooked up by Pajitnov and Pokhilko, a sort of cross between Eliza and Alter Ego with the tentative name of Biographer. “This kind of software can help you to understand your life and change it,” Pajitnov believed. “This will help people. This is what I would call a ‘constructive game.'” Such a game would be, needless to say, a dramatic departure from Tetris, demanding far more time to develop than had that exercise in elegant minimalism.

In the meantime, though, Tetris was huge. Henk Rogers, now well-established as Pajitnov’s mentor in the videogame business, advised him strongly to make a sequel, and to do it soon — for if you don’t strike soon, he told him, someone else will. Pajitnov therefore came up with a game he called Welltris, a three-dimensional Tetris in which the shapes fell down into a “well” — thus the name — which the player viewed from above. While Welltris lacked the immediate, obvious appeal of Tetris, some players would come to love it for its subtle complexity. Pajitnov had no problem selling it to Spectrum Holobyte in North America, who despite their affiliation with the hated Mirrorsoft had managed by dint of luck and cleverness to remain in his good graces. In Europe, the game was picked up by the French publisher Infogrames. It was released for personal computers on both continents before 1989 was through.

Befitting his growing celebrity status, Alexey Pajitnov himself featured in the background graphics of Welltris.

Pajitnov greeted a new year and a new decade by taking his first trip to the United States. Paid for by Spectrum Holobyte, whose resources were limited, it was an oddly austere promotional junket for the designer of the most popular videogame in the world. He made the trip alone but for a translator to help out with his still-broken English. His first stop was Las Vegas, for the Winter Consumer Electronics Show — quite the introduction to American excess! He sat gaping in wonder at the food piled in mounds before him at the hotel’s $3.69 all-you-can-eat buffet, flicked his “I Love Vegas” cigarette lighter, and remarked, “So this is a typical American city.” After Vegas, he traveled around the country in what a Boston Globe reporter described as “almost an underground manner,” “apartment to apartment, computer friend to computer friend.” But he was hardly a man of expensive tastes anyway; out of all the food he ate on the trip, it was Kentucky Fried Chicken he liked best.

Spectrum Holobyte did spring for a lavish press reception in San Francisco’s St. Francis Hotel. The day after that event, he made time for a “U.S./Soviet Personal Computer Seminar” at San Francisco State University out of his desire to “grow up the game life in Moscow. I want to help, with advice.” He visited Minoru Arakawa and Howard Lincoln at Nintendo of America’s headquarters in Seattle; got to see in person paintings at the Museum of Modern Art and the Metropolitan in New York which he’d known only from the books in his parents’ library; visited MIT’s Media Lab to view cutting-edge research into virtual reality, trying not to compare the technology that surrounded him there too closely to the spartan desk he had to share with others back at the Moscow Computer Center. And between all these glimpses of American life, he gave interview after interview to an endless stream of journalists eager to get their whack at one of the great current human-interest stories. He made time for everyone, from the slick reporters from the major newspapers and magazines to the scruffy nerds from the smallest of the trade journals. One and all treated him as living proof of the changing times, a symbol of the links that were being forged between East and West in this new, post-Cold War order.

His odyssey wound up in Hawaii with Henk Rogers, swimming and kayaking and drinking mai tais. The two friends were a long, long way from the gray streets of Moscow where they had met, but the bond they had forged there endured. Over drinks one gorgeous starlit evening, Rogers asked Pajitnov if he would be interested in leaving the Soviet Union permanently to work on games in the West. Torn between the wonders he had just seen and everyone’s natural love for the place he came from, he could only shrug for now: “I do not have an answer for that question.”

Rogers had had good reason for asking it. Ever ambitious, he had used the first influx of cash from Tetris to establish a new branch of Bullet-Proof Software in Seattle, conveniently close to Nintendo of America’s headquarters. Broadly speaking, his intention was to do in the North American Nintendo market what he had been doing in Japan: find games in other countries and on other platforms that would work well on the Nintendo Entertainment System and/or the Game Boy, license the rights, and port them over. His first big North American release, a puzzle game called Pipe Dream that had already been a hit on home computers in Europe, would do very well on the NES and Game Boy as well.

Yet Rogers was also eager to do original games with Pajitnov. He had passed on Welltris, whose 3D graphics were a little more than the Nintendo machines were realistically capable of, but kept cajoling Pajitnov to come up with yet another, more Nintendo-friendly Tetris variant. The result was Hatris, where the falling shapes of Tetris were replaced with falling hats which had to be stacked atop one another according to style. Although it presaged the later craze for matching games in the casual-game market even more obviously than had Tetris, it wasn’t all that great a game in its own right. Even on his American publicity tour, when it was still in the works, Pajitnov described it without a lot of enthusiasm. The sub-genre he had created was already in danger of being run into the ground.

Wordtris

But the industry, inevitably, was just getting started. The next several years would bring heaps more variations on the Tetris template, a few of them crediting their design fully to Pajitnov, some of them crediting him more vaguely for the “concept,” some of them not crediting him at all. Some ran on computers, some ran on consoles from Nintendo and others. Some were very playable, some less so. Personally, I have a soft spot for 1991’s Wordtris, a game designed by two of Pajitnov’s Russian partners at Dialog where you have to construct words, Scrabble-style, out of falling letters. In addition to its other merits, it became another casual pioneer, this time of the sub-genre of word-construction games. But then, I’m far better at verbal puzzles than spatial ones, so my preference for the wordy Wordtris should perhaps be taken with a grain of salt.

Despite all the industry’s enthusiasm for Tetris-like games, Pajitnov and Pokhilko’s plans for an Eliza killer came to naught. Publishers were always willing to use Pajitnov’s name to try to sell one more falling-something game, but didn’t think it had much value attached to high-concept fare like Biographer.

In 1991, Pajitnov finally answered in the affirmative the question Rogers had posed to him on that evening in Hawaii; he and his family immigrated to San Francisco. Cutting ties with Dialog back in a Soviet Union that was soon to be known simply as Russia again, he formed a design partnership with Pokhilko, who soon joined him in the United States. Over the next several years, the two created a variety of simple puzzle games, some more and some less Tetris-like, for various publishers, along with at least one truly outré concept, the meditative, non-competitive “aquarium simulator” El-Fish.

Meanwhile the times were continuing to change back in Russia, as piece after piece of the state-owned economy was privatized. Among the entities that were spun off as independent businesses was ELORG. Nikoli Belikov, savvy as ever, ended his career in the bureaucracy by becoming the owner and chief executive of the new ELORG LLC.

Henk Rogers had all but promised Pajitnov shortly after they had met in Moscow that, although he might not be able to secure a Tetris royalty for him right away, he would take care of him in the long run. He had indeed looked out for him ever since — and now he was about to deliver the ultimate prize. He came to Belikov with a proposal: Belikov still didn’t know much about the videogame business, Rogers pointed out, but Rogers himself did. In return for a 50 percent stake in Tetris, he would take over the management of what was by now not so much a videogame as a global brand. Belikov agreed, and Rogers and Pajitnov together formed The Tetris Company in 1996 to manage the Western stake. And so at last Alexey Pajitnov started getting paid — and paid very well at that — for his signature creation.

For Rogers, protecting Tetris represented an almost unique challenge. More than with virtually any other videogame, the genius of Tetris is in the concept; the implementation is trivial in comparison, manageable by any reasonably competent programmer within a few weeks. And, indeed, the public-domain and shareware software communities in the West had been flooded with clones and variants almost from the moment the game had first appeared on Western computers in 1988, just as had been the land behind the Iron Curtain in the years prior to that. No more, said Rogers. He hired a team of lawyers to go after anyone and everyone who made a game of falling somethings without the authorization of The Tetris Company, attacking with equal prejudice those who tried to sell their versions and those — often beginning game programmers who were merely proud to show off their first creations — who shared them for free. His efforts created no small uproar on the Internet of the late 1990s, leaving him to take plenty of heat as, as he once put it himself, “the jerk behind The Tetris Company.”

To this day, the bulk of The Tetris Company’s time and energy is devoted to the relentless policing of their intellectual property. As one would expect, Rogers and company have tended to draw the broadest possible line around what constitutes an infringing Tetris clone. The location of the actual line between legality and illegality, however, remains curiously unresolved. The Tetris Company has always had a lot of money and a lot of lawyers to hand, and no one has ever dared engage them in a legal battle to the death over the issue.

In addition to managing The Tetris Company, Henk Rogers continued to run Bullet-Proof Software throughout the decade of the 1990s. The first half of that period was marked by a number of successful non-Tetris titles, such as the aforementioned Pipe Dream, but over time Bullet-Proof increasingly dedicated themselves to churning out permutation after permutation on a game that many would argue had been born perfect: Tetris 2, Tetris Blast, V-Tetris, Tetris S, Tetris 4D. By decade’s end, they were running out of names. Screw it, they said in 1998, we’ll just call the next Tetris The Next Tetris.

Bullet-Proof closed up shop shortly after that game, but Rogers formed Blue Lava Wireless in 2002 to make games for the first wave of feature phones. Their most successful titles by far were… you guessed it, mobile versions of Tetris. Indeed, the convergence of Tetris with mobile phones drove a second boom that proved just as profitable as the first, Game Boy-driven wave of mobile success.

Having long since sold Blue Lava, Rogers lives the good life today in his first geographical love of Hawaii, running the Blue Planet Foundation, which has the laudable goal of ending the use of carbon-based fuels in Hawaii and eventually all over the world; he also oversees a commercial spinoff of the foundation’s research called Blue Planet Energy. He’s still married to the girl who tempted him to move to Japan all those years ago. And yes, he’s a very, very rich man, still making millions every year from Tetris.

Alexey Pajitnov has continued to kick around the games industry, plying his stock-in-trade as a designer of simple but (hopefully) addictive puzzle games and enjoying his modest celebrity as the man who made Tetris. He spent several years at Microsoft, where he was responsible for titles like The Microsoft Puzzle Collection and Pandora’s Box; some of the puzzles found therein were new, but others were developed contemporaneously with the original Tetris during all those long days and nights at the Moscow Computer Center. He’s slowed down a bit since 2000, but keeps his hand in with an occasional mobile game, which his reputation is always sufficient to see published. In light of his ongoing design work, it was perhaps a bit unkind of a 2012 IGN article to call him one of the games industry’s “one-hit wonders.” Still, Tetris was so massively important that it was all but an inevitability that it would overshadow every other aspect of his career. Pajitnov, for his part, seems to have made his peace with that, humoring the wide-eyed reporters who continue to show up to interview him at his current home near Seattle. It may have taken Tetris a long time to pay off for him, but he can have no complaints about the rewards it brings him today: he is, like Henk Rogers, very, very rich. The two men remain close friends as well as business partners in The Tetris Company; the warm relationship forged over vodka on that cold Moscow night back in February of 1989 continues to endure.

Bosom buddies from Day One: Alexey Pajitnov and Henk Rogers in Moscow, 1989.

Nikolai Belikov, the most unlikely person ever to become a millionaire thanks to videogames, finally cashed in his Tetris chips in 2005, selling his stake in the game to The Tetris Company for $15 million. Doing business in his country remained as complicated as ever at that time. The only difference was that Belikov now had to fear the Russian Mafia finding out about his windfall rather than the KGB and the other entrenched forces of the old communist system. In an episode that had much the same spy-movie flavor as the original Tetris negotiations, Belikov and Rogers signed the deal and exchanged funds in Panama. The former then went back home to Moscow to enjoy a retirement worthy of an oligarch.

The other people Alexey Pajitnov left behind in Russia weren’t quite so fortunate. Dialog, the company he had helped found there before emigrating, collapsed soon after the departure of their star designer. And with the end of Dialog ended the game-design career of Dmitry Pavlovsky, who went back to other computer-science pursuits.

Vadim Gerasimov, the third member of the little game-making collective that had spawned Tetris, had parted ways with Pajitnov even before the short-lived Dialog experiment. The most idealistic hacker in the Russian camp, he had never been comfortable with the circus that sprang up around the Tetris rights. He claims that at some point after ELORG got involved in the negotiations Pajitnov came to him with a paper to sign. It stated, according to him, "that I agree to only claim porting Tetris to the PC, agree to give Pajitnov the right to handle all business arrangements, and refuse any rewards related to Tetris. I did not entirely agree with the content, but I trusted Alexey and signed the paper anyway." Gerasimov had been given credit right after Pajitnov himself as the "original programmer" of Tetris in the Mirrorsoft and Spectrum Holobyte versions, but by the time the Nintendo versions appeared his name had been scrubbed from the game. To my knowledge, he never made any money at all from writing the first version of Tetris to reach beyond the Iron Curtain. While I hesitate to condemn Pajitnov or anyone else too roundly for that state of affairs — after all, even Pajitnov himself wasn't paid for many years for his game — it does strike me as unfortunate that Gerasimov was allowed, whether consciously or accidentally, so completely to slip through the cracks. Had he not lent Pajitnov his talents, it's highly unlikely that the game would ever have become more than a curiosity enjoyed by a handful of people in and around the Moscow Computer Center.

Gerasimov does evince a tinge of bitterness when he speaks about the subject, but, to his credit, he hasn’t let it consume his life. Instead he’s made a fine career for himself, immigrating to the United States, earning a doctorate from MIT, and finally winding up in Australia, where he works for Google today as a software engineer. He claims not to agree with The Tetris Company’s policy of so zealously protecting the property — although his position should perhaps be considered with a certain skepticism in light of the fact that he has never been in a position to benefit from that protection.

It was Pajitnov's good friend and frequent design partner Vladimir Pokhilko who came to by far the worst end among the Russians who were there to witness Tetris's birth. Having cut his business ties with Pajitnov after the latter took a staff job with Microsoft in 1996, he tried to make a go of it in Silicon Valley, but bet on all the wrong horses. Beset by financial problems and rumored entanglements with the Russian Mafia, in 1998 he murdered his wife and son with a hammer and a hunting knife, then cut his own throat. His good-natured comment to his buddy at the Moscow Computer Center in 1984 — "I can't live with your Tetris anymore!" — suddenly took on a different dimension in the aftermath. Pajitnov, so garrulous on most subjects, clams up when the topic turns to Pokhilko, the old friend who was obviously hiding a profound darkness behind his cheerful smile. "We [were] always friends, colleagues, and partners with good and warm relations," he says, and leaves it at that — as shall we.


 

Pokhilko wasn’t the only character in the Tetris story to come to a bad end. Sometime during the early morning of November 5, 1991, Robert Maxwell jumped or fell off the deck of his luxury yacht into the sea near the Canary Islands. His body was discovered by a passing fishing boat the next day.

When executors and regulators began to look closely at Maxwell’s personal finances and those of his vaunted publishing empire in the aftermath of his death, what they found appalled them. He had racked up more than $3 billion in debts, and had been stealing from his employees’ pension funds and making countless other illegal transactions in order to shore up both his business interests and his own lavish lifestyle. Pathologists hadn’t been able to agree on a definite cause of death after his battered body was recovered. In light of this, and in light of his political and financial entanglements all over the world, not to mention the accusations of espionage that have occasionally dogged him, conspiracy theories abound about his fate. One suspects, however, that the more prosaic explanations are the more likely: either he deliberately threw himself into the water to escape the financial reckoning he knew was coming, or, being grossly overweight, he had a heart attack while taking in the view and simply fell into the water. There is, however, one other oddity about the whole thing to reckon with: his body was completely naked when it was recovered. It’s highly doubtful that the mystery of Robert Maxwell’s death will ever be solved beyond a shadow of doubt.

The strange circumstances of Robert Maxwell’s death caused a sensation in the British tabloid press.

In the wake of the Maxwell empire’s collapse, Kevin Maxwell filed for the biggest personal bankruptcy in British history, writing off more than $1 billion in debts. He was taken to trial for conspiracy to commit fraud for his involvement with his father’s house of cards, but was acquitted. His business record since has been checkered, encompassing another huge bankruptcy and repeated accusations of malfeasance of various stripes.

Like most of Robert Maxwell’s properties, Mirrorsoft was sold off in the scandal that followed his death. The operation wound up in the hands of Acclaim Entertainment, but the name disappeared forever.

Over in the United States, Spectrum Holobyte managed to live considerably longer. Phil Adam and Gilman Louie, the partners who had long run the publisher, pulled together enough venture capital in the midst of the Maxwell empire’s collapse to buy their complete independence. They then went on another buying spree as the merry 1990s got going in earnest, picking up both the computer-game publisher MicroProse and the American division of Henk Rogers’s Bullet-Proof Software in 1993. Combined with the good relations they enjoyed with Rogers and Pajitnov, the latter purchase seemingly left them well-positioned to continue to exploit Tetris for years to come. But they had extended themselves too far too quickly, picking up a mountain of debt in the process. Caught out in between the first and the second great Tetris booms, they were never quite able to turn the corner into reliable profitability. Hasbro Interactive bought the troubled company in 1998, and the name of Spectrum Holobyte also vanished into history.


 

Robert Stein lost his rights to Tetris on personal computers in 1990, when ELORG terminated his license due to his ongoing failure to pay them in a timely manner. Stein did cite an excuse for his tardiness this time, claiming that Mirrorsoft had simply stopped paying him for their sub-license altogether amid the brouhaha of 1989. Since Spectrum Holobyte was paying their royalties to Mirrorsoft, who were supposed to then pass them on to Stein, such a refusal would have meant that Stein himself would have received nothing at all to pass on to the Russians. Regardless of the full truth of the matter, Stein was forced out of the picture, and Spectrum Holobyte negotiated their own license directly with ELORG in order to continue making their version of Tetris.

Stein lost his last remaining Tetris deal, for the arcade rights, in 1992. Once again, ELORG cited non-payment as their reason for terminating the contract, and once again Stein claimed that his sub-licensee — this time Atari Games — wasn’t paying him.

Years before this event, Stein’s little company Andromeda Software, thoroughly unequipped to compete in the evolving international videogame market, had ceased to exist as anything other than a paper entity. With the termination of his last ELORG deal ended his time in games. I don’t know what he’s been doing in the years since, but he is alive and apparently well.

A recent still of Robert Stein from the documentary Moleman 4.

To this day, however, Stein remains deeply embittered against virtually every other character in the story of Tetris. His own narrative is at odds with that of the other principals on a number of key points. He confesses only to naiveté and perhaps the occasional bout of carelessness, never to deliberate wrongdoing. In his telling, the story of February and March of 1989 is that of a premeditated conspiracy, orchestrated by Henk Rogers, to steal Tetris away from him. While he admits to having earned about $150,000 to $200,000 from Tetris — no mean total in the context of most videogames of its era — he’s clearly haunted by all the tens of millions earned by Rogers, Pajitnov, and Belikov. “Tetris made enemies out of friends and corrupted people left, right, and center,” he says.


 

The war between the two Ataris and Nintendo raged on long after the issue of the Tetris rights was decided in November of 1989.

In May of 1989, Atari Games had launched still another lawsuit against Nintendo, this time alleging them to have violated their patent on an “apparatus for scrolling a video display” from 1979. Meanwhile Nintendo continued to put the squeeze on Atari at retail, threatening to cut off stores who dared to stock the Tengen games. Atari’s Dan Van Elderen claims that eventually all fifteen of the largest retail chains in the country dropped Tengen in the face of such pressure.

Atari found a friendly ear for their tale of woe in United States Congressman Dennis Eckart. The Democrat from Ohio was the chairman of a subcommittee focused on antitrust enforcement, and he was already inquiring into Nintendo's business practices when he was contacted by Van Elderen. Van Elderen and other Atari executives became star witnesses in building the case for an official government investigation into Nintendo, while Eckart never even contacted anyone from the opposing camp to ask for their side of the story. On December 7, 1989, the 48th anniversary of the attack on Pearl Harbor — the timing struck very few as coincidental — Eckart held a press conference to announce his recommendation that the Justice Department launch a probe of Nintendo's role in the videogame market. As Howard Lincoln would later note, the press conference couldn't have favored Atari's position more had the latter written the script — which, in light of the cozy relationship that had sprung up between Eckart and Atari, there were grounds to suspect they had. The Justice Department soon handed the case to the Federal Trade Commission.

While the FTC investigated, Eckart continued to speak out against the Menace from Japan, blending long-established generational hysteria against videogames in general with Japophobia. Like many in the software industry, his greatest fear was that Nintendo would turn the NES into a real home computer, as they were trying to do to the Famicom in Japan, and use it to take over the entire market for consumer software. Think of the children, he thundered: “How did they [Nintendo] get in American homes? They enticed their way in through our children’s hearts. If you turn a toy into a computer, what’s the next step?”

Under mounting pressure from several sides, Nintendo eased their licensing conditions somewhat on October 22, 1990, allowing some licensees to begin manufacturing their own game cartridges. They also dropped the provisions from their standard licensing agreements that restricted their licensees from releasing their games on rival platforms. Many publishers would admit privately that the removal of this specific language from the contract changed little in actuality — “I’m not going to make games for competing systems because we know that Nintendo would get even, one way or another,” said one — but it did do much to nullify the primary charge in Jack Tramiel’s Atari Corporation’s case against Nintendo: that not allowing licensees to release a game on a rival platform for two years after its appearance on the NES constituted an abuse of monopoly. Some suspected that the changes at Nintendo had been the result of a deal with the FTC which allowed them to avoid some charges of engaging in anti-competitive practices.

Although Nintendo was clearly moderating some of their stances in response to recent developments, Atari Games still couldn’t manage to get a win in Judge Fern Smith’s courtroom. On the contrary: on March 27, 1991, she handed them another devastating defeat. A forensic study of the code used in the lockout-defeat mechanism employed by the Tengen cartridges having shown it to be virtually identical to Nintendo’s own, Judge Smith excoriated Atari for their actions in words that left no doubt about her opinion of their business ethics. She certainly made no effort to sugarcoat her verbiage: “Atari lied to the Copyright Office in order to obtain the copyrighted program,” she bluntly wrote. After having killed the Tengen Tetris fifteen months before, she now ordered all unauthorized Tengen games for the NES to be pulled from the market.

On April 10, 1991, the FTC announced that they had charged Nintendo with price-fixing in light of the latter’s policy of demanding that retailers not discount their products. But, anxious to avoid another ugly public legal battle, Nintendo had agreed to a settlement which required them to discontinue the policy in question, to pay $4.75 million to cover the government’s administrative costs in conducting the investigation, and to send consumers who the government claimed had been negatively affected a $5 coupon good for future Nintendo purchases. And with that, the FTC’s Nintendo probe was effectively finished. Everyone agreed that Nintendo had gotten off rather shockingly easy in being allowed to turn what should have been the negative press of a major judgment against them into what amounted to a new sales promotion. It almost seemed like someone at the FTC had a soft spot for them. Both Ataris blasted the easy treatment of Nintendo in the press, indelicately implying that something underhanded must be going on between Nintendo and the FTC.

But brighter news for the Atari camp did come the very next day. Atari Games had bitterly contested Judge Smith’s injunction stating that they couldn’t sell their Tengen games on the NES unless and until their appeal of her most recent ruling was concluded in their favor. They claimed the injunction could very well drive them out of business before that day arrived, making their ongoing appeal moot. On April 11, the appeals court agreed to give them back their right to sell the Tengen games while the legal proceedings ground on.

Whatever the situation in court, the two Ataris could feel fairly confident that they were at the very least holding their own in the public-relations war. In January of 1992, Michael Crichton summed up the mood of many inside and outside the American government with his novel Rising Sun. A thinly disguised polemic against Japan — Crichton himself described the book as a “wake-up call” to his country about the Japanese threat — it centers on a fictional corporation called Nakamoto whose mysterious leader sits far away in Japan at the center of a web of collusion and corruption. It was hard not to see the parallels with Nintendo’s greatly-feared-but-seldom-seen president Hiroshi Yamauchi. Price-fixing and other forms of collusion are “normal procedure in Japan,” says one of Crichton’s characters. “Collusive agreements are the way things are done.”

Just months after the publication of the novel, Yamauchi confirmed all of its worst insinuations in the eyes of some when he bought the Seattle Mariners baseball team. The Japanese, it seemed, were taking over even the Great American Pastime. What was next, Mom and apple pie? Major League Baseball approved the sale only on the condition that the day-to-day management of the team remain in American hands.

Hoping to capitalize on the political sentiment that so often painted Nintendo as a dangerous foreign invader, Tramiel’s Atari Corporation elected to take their $250 million lawsuit against Nintendo to a jury trial. But whatever abuses Nintendo may have committed, they had been smart enough not to give Atari any smoking guns in the form of written documentation of their more questionable policies. When Minoru Arakawa took the stand, he faced a barrage of aggressive accusations from Atari’s lawyers.

Isn’t it a fact that if Atari or Sega was also being carried, the salesman would go in and say they won’t be able to carry Nintendo?

Did Nintendo ever tell any licensee that they could only make games for Nintendo?

Did Nintendo ever tell them, the licensees, that if they put their games on any other system they would be penalized?

That Nintendo would reduce their allocation of chips during the chip shortage?

Cancel trade-show space?

Any threats to prevent them from making games for other home-videogame systems?

To every question, Arakawa gave a one-syllable answer: “No.”

On May 1, 1992, the verdict came back. The jury did acknowledge that Nintendo had enjoyed a de facto monopoly over the American console market at the time Atari had filed their suit, but just having a monopoly is not illegal in itself. The jury found that Atari hadn’t managed to prove that Nintendo had abused their monopoly power. Atari Corporation would elect not to appeal the verdict.

On September 10, 1992, Atari Games lost their appeal of Judge Smith's ruling against them from the previous year, and the injunction which the appeals court had suspended in the interim went into permanent effect. All Tengen games on Nintendo's platforms were to be pulled from store shelves and destroyed, effective immediately. Tengen and Nintendo had been parted forever.

With these last two rulings, the war entered the mopping-up phase, the final result all but a foregone conclusion. In 1990, the delicate stock-balancing act that had allowed Hideyuki Nakajima to run Atari Games as an independent entity had collapsed, and Time Warner had assumed control. The latter had been skeptical from the beginning of a war that seemed to have at least as much to do with pride and legacy as with sound business strategy. Following these latest setbacks, they pressed Nakajima hard to cut his losses and settle the remaining legal issues. Atari Games and Nintendo announced a closed settlement agreement on March 24, 1994, that put the last of the litigation to bed, presumably at the cost of some number of millions from Atari.

Shortly thereafter, Atari Games ceased to exist under that name. On April 11, 1994, Time Warner went through a restructuring which saw Atari and Tengen subsumed into the preexisting subsidiary Time Warner Interactive. With the arcade market slowly dying and Nintendo certainly not likely to let them back onto their platforms anytime soon, the storied name of Atari had become a liability rather than an asset.

Atari Corporation came to a similarly dispiriting end. After years of creeping irrelevancy brought on by the slow decline of their ST line of computers and the more dramatic failure of the Atari Jaguar, a quixotic last-ditch effort to launch a game console to compete directly with Nintendo, the remnants of the company were scooped up by JTS Storage, a maker of hard disks, on July 30, 1996. Thanks to the financial contortions that were used to bring off the deal, the transaction was counter-intuitively recorded as an acquisition of JTS by Atari, but there was no doubt on the scene about who was really acquiring whom. Like Time Warner, JTS saw little remaining value to the Atari name; they had acquired Atari Corporation for nothing more nor less than a pile of cold hard cash the company had recently collected after winning a patent-infringement judgment against Sega, not in the hope of making Atari mean something again to a new generation of gamers for whom the name’s glory days were ancient history. But the money didn’t do them much good; JTS went bankrupt in 1999.

In a previous article, I called the war between the two Ataris and Nintendo the past of videogames versus their future. As we can now see, that description is almost literally true. The two Ataris could perhaps console themselves that they had forced some changes in Nintendo’s behavior, but they had paid a Pyrrhic price for those modest tactical victories. After the war was over, both Ataris died while Nintendo thrived. Winning the war so utterly became one of the proudest achievements of Howard Lincoln, that unapologetically vindictive master strategist. “Lincoln’s motto was ‘fuck with us and we will destroy you,'” said one of Nintendo’s lawyers. “Otherwise he’s a really nice guy.”

Yet the full import of the war extended far beyond its importance to the individual combatants: it marked a watershed moment for the way that software is sold. Buried in the text of Judge Smith’s 1991 ruling against Atari Games was the statement that legitimized the future not just of console-based videogames but of much of the rest of the consumer-software market. Irrespective of the shady methods Atari had employed to violate Nintendo’s patent in the case at hand, Judge Smith affirmed that Nintendo did have the abstract right to “exclude others and reserve to itself, if it chooses” complete control of the Nintendo cartridge market. This statement essentially reversed an established precedent, dating back to an antitrust case that was decided against IBM in 1969, that a hardware manufacturer could not decide what software was allowed to run on their machines. With the legal cover Judge Smith provided, what had once been shocking enough to set most of the American software industry up in arms soon became routine. Every successful console that would follow the NES, from Nintendo or anyone else, would use the walled-garden model. Even more significantly, virtually every significant new software market to arise in the future, such as the mobile-phone and tablet markets that thrive today, would also be a walled garden controlled by a single corporation. Today, the un-walled marketplace for personal-computer software has become the exception rather than the rule, a shambling ghost from the past which no corporation has yet been able to corral. And long may it shamble on, for it continues to provide a haven in interactive media for the experimental, the controversial, the iconoclastic, and the esoteric — all the things the walled gardens reject.

The anti-Japanese, anti-Nintendo sentiment in the country, which had threatened to reach xenophobic levels in some circles, gradually faded as the decade wore on and Nintendo lost some of their standing as the be-all, end-all in videogames. Their much-feared strategy of using the NES as a Trojan Horse to take over all of consumer computing never really got off the ground. The enhancements that turned the Famicom into a full-fledged computer had never done tremendously well in Japan, and the Nintendo Network there turned into one of the company’s rare outright failures, never getting beyond the tens of thousands of subscribers. A survey found that the biggest source of consumer resistance to the idea was, ironically, Nintendo’s established reputation in videogames. People just weren’t excited about using what they thought of as their children’s toys to manage their stock portfolios. In light of these setbacks in Japan, Nintendo never introduced either the “computery” hardware enhancements they had tried on the Famicom or the Nintendo Network to North America. They instead elected to content themselves with their lot as the biggest company in videogames, much to the relief of the American software industry.

But even in the field of videogames, Nintendo wouldn't stand alone for much longer. Sega had already introduced their Genesis console in North America at the end of 1989. Following a slow start, it eventually turned into a viable competitor for the aging NES. At mid-decade, Sony arrived with the PlayStation as a third major player in the game-console space. While both Sega and Sony adopted Nintendo's walled-garden approach to software, one could no longer claim that videogaming writ large in North America lived by the whims of a single company. Yes, one could still be unhappy that all three popular console-sellers in the United States were Japanese — another successful born-in-America console wouldn't arrive until the Microsoft Xbox in 2001 — but even that concern faded somewhat with the tech boom of the mid- and late-1990s. In this market, there was plenty of room for everyone, even the shifty-eyed foreigners.

Minoru Arakawa resigned as president of Nintendo of America in January of 2002. When not enjoying semi-retirement, he has since worked with Henk Rogers and Alexey Pajitnov on various projects related to Tetris.

Howard Lincoln was appointed CEO of the Seattle Mariners in 1999, signaling a scaling back in his involvement with Nintendo proper. He continued to run the team until 2016. His tenure produced no triumphs to compete with his great victory over the two Ataris; the team made the playoffs the first two years with him at the helm, but never again after that.

Minoru Arakawa and Howard Lincoln at the Interactive Achievement Awards in 2007, where they accepted lifetime-achievement awards.

In 2003, the North American arm of the French publisher Infogrames re-christened themselves Atari. Any hopes they might have had to revive the name’s glory days were, however, sadly disappointed. After years of lurching from crisis to crisis, the new Atari filed for Chapter 11 bankruptcy in 2013. Today a skeleton staff makes casino games under the name.

As for Nintendo… well, Nintendo remains Nintendo, of course. They’ve long since surpassed their old rival Atari to become the most iconic name in videogames. Once, parents would say that their children liked to “play Atari,” regardless of what name happened actually to be printed on the front of their game console. Now, they say their children like to “play Nintendo” — and, thanks largely to what Tetris first began, they’ll often say that they “play Nintendo” themselves. Nintendo has had ups and downs over the years, but have on the whole remained ridiculously successful thanks to the same old blending of strategic smarts with a deep well of ruthlessness ready to be employed when they judge the situation to call for it. Some variant or another of Tetris — usually more than one — has been a staple of every Nintendo machine since the NES and Game Boy.


 

And so we're left with only one more fate to describe: that of the place where we began this journey, the Mirror World of the Soviet Union.

The dawn of the 1990s was an extraordinary time in the often fraught history of East/West relations. The opening of the Soviet Union brought with it the expectation that the world was witnessing the dawn of a new economic superpower even as a military superpower fell. Freed from the yoke of communism, Russia seemingly had everything going for it. It was a sprawling land bursting with natural resources, with an educated population eager to shed their isolation and become a part of the free world’s economic and political order. This was the period when Francis Fukuyama was writing The End of History, claiming that with the end of the Cold War free societies and free markets had won history’s argument, leaving humanity with nothing left to do but enjoy their fruits.

Few industries were more excited about jumping through the mirror and doing business in the lands beyond than the computer industry. As relations improved between West and East, the restrictions implemented by the Coordination Committee on Export Controls in the West were gradually eased. Already by 1988, it had been permitted to sell 16-bit microprocessors like the 80286 to the Soviet Union; by 1990, 32-bit processors like the 80386 were also allowed; by 1991, the restrictions as a whole were no more. Conferences and seminars sprang up, places for Western business executives to meet with former Eastern government bureaucrats newly thrust into the same role by their countries' privatizing economies.

Of course, those Westerners peering eagerly through the mirror still had their work cut out for them in lots of ways. Most of the Eastern European economies were in complete disarray, with devalued currencies and thus little hard cash for buying Western products.

But where there’s a desire to do business, there’s usually a way. Some enterprising Western exporters resorted to complicated three-party deals. The would-be computer exporter would give their machines to an agent, who would send them on to Eastern Europe in exchange for raw goods. The agent would then sell the goods back in the West, and give the computer exporter a chunk of the profits. By 1992, an 80286-based PC in Russia cost about $1100. This was certainly better than the $17,000 one could have expected to pay for a shoddy Apple II clone nine years before, even if that kind of money would buy you a much more powerful 80386-based machine in the United States.

Issues of Byte magazine from the early 1990s buzz with excitement about the opportunities awaiting citizens of the Mirror World, not to mention those in the West who planned to guide them down the tricky paths of capitalism. “These are people who have felt useless — useless — all their lives!” said American business pundit Esther Dyson of the masses getting their first taste of freedom. “Do you know what it is like to feel useless all your life? Computers are turning many of these people into entrepreneurs. They are creating the entrepreneurs these countries need.”

And yet, sadly, the picture of a Russian entrepreneur in the popular imagination of the West of today is that of a mobster. Under the benighted stewardship of Boris Yeltsin, the high hopes for the Russia of the early 1990s gave way to the economic chaos of the mid- and late-decade years, paving the way for one Vladimir Putin. The new Russia proved unable to overcome the culture of corruption that had become so endemic during the old Soviet Union’s Brezhnev era.

I’m hardly qualified to provide a detailed analysis of why it has been so hard for Russia to escape its tragic past, any more than you are likely up for reading such a thing at the end of this already lengthy article. Since moving to Europe in 2009 and continuing to be subjected to my fellow Americans’ blinkered notions of what is “wrong” here and how it should be fixed, I’ve reluctantly concluded that the only way to really know a place may be to live there. So, in lieu of flaunting my ignorance on this subject I’ll just provide a few final anecdotes from my trip across Russia back in 2002.

  1. Driving around Moscow with several other backpackers and a guide we’d scraped together enough money to hire, I noticed some otherwise unmarked cars sporting flashing lights on the roof, of the sort that the driver can reach up through the window to set in place as she drives. There were a lot of these cars, all rushing about purposefully with lights undulating like mad. Was there really that much crime in the city? Or was some sort of emergency in progress? Weird if so, as the cars with the flashing lights by no means all seemed to be headed in the same direction. Curious about all these things, I finally asked our guide. “Oh, most of those cars aren’t actually police,” she said. “If you pay the right person, you can get a light like that for your personal car. Then you don’t have to stop at the traffic lights.”
  2. This being the period just after George W. Bush had looked into Putin’s eyes and seen clear to his “straightforward and trustworthy” soul, I was interested to hear what ordinary Russians thought of their new leader. To a person, the Russians I talked to found the West’s hopes that Putin would prove an enlightened steward of his people hilarious. “Putin was KGB during the Soviet times,” they told me. “Do you not understand what that means?”
  3. In all my life’s travels, I’ve never witnessed a more cash-driven economy than the Russia of 2002. Many of the Russians I met said they would accept their salary from their workplaces only in cash. Life savings were hoarded inside mattresses and behind paintings, for, after the economic collapse at the end of Yeltsin’s reign which had cost many Russians their previous life savings, banks were a bad joke, a sucker’s game. There were no ATMs outside of Moscow, nowhere to cash checks. On the plus side, you could buy most things with American dollars if you ran out of Russian rubles. Indeed, these were vastly preferable to many Russians, being the engine that made the underground economy go.
  4. I was single at the time, and families kept wanting to show me their daughters, with an obvious eye to marriage and a ticket to the West. Luckily, I was old enough to know to keep my distance from what could have been some very dangerous entanglements, although it was kind of fun for an average-looking guy like me to be fawned over like fitness-model material for a while.
  5. When being shown around Vladivostok at the end of my trip — and it is a very lovely city — I asked our guide about some huge mansions built into the hills surrounding the harbor and visible from many spots in the city proper. She seemed scared to say too much about them. “That’s where the Mafia lives, we don’t go up there” was all I could get out of her. “We” in this context apparently meant ordinary, non-Mafia Russian citizens like her.
  6. Despite such social disparities, few Russians evinced much nostalgia for the Yeltsin era, when they had enjoyed more political freedom than at any other point in their country’s history. Instead many preferred to cast their nostalgic gaze further back, to the Soviet era. Back then, they’d had security, they told me: a paid education, a guaranteed job, paid medical care, a rent-controlled apartment for their family once their name came to the top of the waiting list. If Putin could provide them with those things again, they weren’t overly inclined to quibble about issues like free speech and voting rights. They’d seen what they thought of as democracy up close and personal during the 1990s, and it hadn’t been a terribly pleasant experience for them. The West could keep it as far as they were concerned.

Perhaps somewhere in the intersection of these anecdotes can be found some clues as to what went wrong with the dreams for a healthy, stable, and free Russia.

Today Putin revels in his role as the comic-book evil mastermind, gobbling up territory here, hacking elections there, scheming always to undermine the existing world order and sometimes seeming to succeed at it to a disconcerting degree. Like most "strong-man" leaders, he tells himself and his people that he does these things in the name of nationalism and ethnic pride. Yet the would-be strong man fails to understand that by embracing the role of the geopolitical pariah, by running his country as a criminal enterprise with himself at the top of the oligarchical food chain, he actually turns Russia into a weak nation when it could be such a strong one. The largest country in the world has a gross domestic product less than that of South Korea, and just 7 percent of that of the United States, the nation it so desperately wishes to challenge again on the world stage. Now that the Iron Curtain no longer blocks their way, far too many of the best and the brightest in Russia flee to the West, leaving behind a generation of often hopeless men to literally drink themselves to death; the average life expectancy for a man in Russia is 64 years. The country remains what it has been for centuries: the greatest example of wasted potential on earth.

The Moscow Computer Center as it looks today.

The storied Moscow Computer Center still exists inside the changed Russia, under the official name of the Dorodnicyn Computing Center of the Russian Academy of Sciences. But even in 2004, when the BBC filmed there for a documentary about Tetris, its luster was yet more faded than it had been during Alexey Pajitnov’s time there. Yuri Yevtushenko, the director of the place at the time, painted a rather grim picture: “Our institute is getting older. The average age of the staff is fifty, and I’m afraid that in ten years if this continues without good young support we will cease to exist.” Working there paid a wage of $200 per month — hardly much enticement for the next generation of talented young Russian hackers. His analysis of the Computer Center’s future prospects could stand in for those of his country: “In Russia, the most widespread strategy is the ‘perhaps’ strategy. It has often saved us. During wars, at the beginning everything looks hopeless. Any other country would probably have been destroyed and died, but Russia somehow finds a way to pull through and survive. I hope it will be the same here too.”

Yes, hope must live on. The Putin era too shall pass, and Russia will perhaps in time get another chance to realize its potential.

The way a narrative history like this one reads has always been a function of where it begins and ends — what the historian Hayden White calls “emplotment.” Writing history at the wrong time can be intellectually dangerous, as Francis Fukuyama, who has become a walking punch line in the wake of all the history that has transpired since his The End of History, can doubtless well attest. The thing about history, for good and, yes, sometimes for ill, is that it just keeps on happening. Maybe, then, I’ll someday be able to write a less melancholic ending for this tale of the Mirror World. The people of Russia certainly deserve one.

(Sources: the books Game Over: How Nintendo Conquered the World by David Sheff, The Tetris Effect: The Game That Hypnotized the World by Dan Ackerman, and Rising Sun by Michael Crichton; the BBC television documentary From Russia with Love; STart of June 1990; GamePro of December 1990; Computer Gaming World of September 1993; Byte of September 1990, January 1991, January 1992, and September 1992; Boston Globe of January 30 1990; San Francisco Examiner of September 24 1998. Online sources include Vadim Gerasimov's personal Tetris recollections; "Tetris Pressures Game Act-Alikes" from Wired; "The People Versus Mario" from Muckrock; "The Mystery of Maxwell's Death" from The Independent; "Russian Men Losing Years to Vodka" from The Guardian; "Off the Grid" from Hawaii Business; "The Man Who Made Tetris" from Motherboard. And one more big thank you to Peter Sovietov for sharing his knowledge of Soviet and Russian computing with me.)

 
 
