Defender of the Crown

If you rushed out excitedly to buy an Amiga in the early days because it looked about to revolutionize gaming, you could be excused if you felt just a little bit disappointed and underwhelmed as the platform neared its first anniversary in shops. There was a reasonable amount of entertainment software available — much of it from the Amiga’s staunchest supporter, Electronic Arts — but nothing that felt quite as groundbreaking as EA’s early rhetoric about the Amiga would imply. Even the games from EA were mostly ports of popular 8-bit titles, modestly enhanced but hardly transformed. More disappointing in their way were the smattering of original titles. Games like Arcticfox and Marble Madness had their charms, but there was nothing conceptually new about them. Degrade the graphics and sound just slightly and they too could easily pass for 8-bit games. But then, timed to neatly correspond with that one-year anniversary, along came Defender of the Crown, the Amiga’s first blockbuster and to this day the game many old-timers think of first when you mention the platform.

Digital gaming in general was a medium in flux in the mid-1980s, still trying to understand what it was and where it fit on the cultural landscape. The preferred metaphor for pundits and developers alike immediately before the Amiga era was the book; the bookware movement brought with it Interactive Fiction, Electronic Novels, Living Literature, and many other forthrightly literary brand names. Yet in the big picture bookware had proved to be something of a commercial dud. Defender of the Crown gave the world a new metaphorical frame, one that seemed much better suited to the spectacular audiovisual capabilities of the Amiga. Cinemaware, the company that made it, had done just what their name would imply: replaced the interactive book with the interactive movie. In the process, they blew the doors of possibility wide open. In its way Defender of the Crown was as revolutionary as the Amiga itself — or, if you like, it was the long-awaited proof of concept for the Amiga as a revolutionary technology for gaming. All this, and it wasn’t even a very good game.

The Cinemaware story begins with Bob Jacob, a serial entrepreneur and lifelong movie buff who fulfilled a dream in 1982 by selling his business in Chicago and moving along with his wife Phyllis to Los Angeles, cradle of Hollywood. With time to kill while he figured out his next move, he became fascinated with another, newer form of media: arcade and computer games. He was soon immersing himself in the thriving Southern California hacker scene. Entrepreneur that he was, he smelled opportunity there. Most of the programmers writing games around him were “not very articulate” and clueless about business. Jacob realized that he could become a go-between, a bridge between hackers and publishers who ensured that the former didn’t get ripped off and that the latter had ready access to talent. He could become, in other words, a classic Hollywood agent transplanted to the brave new world of software. Jacob did indeed become a modest behind-the-scenes player over the next couple of years, brokering deals with the big players like Epyx, Activision, Spinnaker, and Mindscape for individuals and small development houses like Ultrasoft, Synergistic, Interactive Arts, and Sculptured Software. And then came the day when he saw the Amiga for the first time.

Jacob had gotten a call from a developer called Island Graphics, who had been contracted by Commodore to write a paint program to be available on Day One for the Amiga. But the two companies had had a falling out. Now Island wanted Jacob to see if he could place the project with another publisher. This he succeeded in doing, signing Island with a new would-be Amiga publisher called Aegis; Island’s program would be released as Aegis Images. (Commodore would commission R.J. Mical to write an alternate paint program in-house; it hit the shelves under Commodore’s own imprint as GraphiCraft.) Much more important to Jacob’s future, however, was his visit to Island’s tiny office and his first glimpse of the prototype Amigas they had there. Like Trip Hawkins and a handful of others, Jacob immediately understood what the Amiga could mean for the future of gaming. He understood so well, in fact, that he made a life-changing decision. He decided he wanted to be more than just an agent. Rather than ride shotgun for the revolution, he wanted to drive it. He therefore wound down his little agency practice in favor of spearheading a new gaming concept he dubbed “Cinemaware.”

Jacob has recounted on a number of occasions the deductions that led him to the Cinemaware concept. A complete Amiga system was projected to cost in the neighborhood of $2000. Few of the teenagers who currently dominated the ranks of gamers could be expected to have parents indulgent enough to spend that kind of money on them. Jacob therefore expected the demographic that purchased Amigas to skew upward in age — toward people like him, a comfortably well-off professional in his mid-thirties. And people like him would not only want, as EA would soon be putting it, “the visual and aural quality our sophisticated eyes and ears demand,” but also more varied and nuanced fictional experiences. They would, in other words, like to get beyond Dungeons and Dragons, The Lord of the Rings, Star Wars, and Star Trek as the sum total of their games’ cultural antecedents. At the same time, though, their preference for more varied and interesting ludic fictions didn’t necessarily imply that they wanted games that were all that demanding on their time or even their brainpower. This is the point where Jacob diverged radically from Infocom, the most prominent extant purveyor of sophisticated interactive fictions. The very first computer game that Jacob had ever bought had been Infocom’s Deadline. He hadn’t been all that taken with the experience even at the time. Now, what with its parser-based interface and all the typing that entailed, its complete lack of audiovisual flash, its extensive manual and evidence reports that the player was expected to read before even putting the disk in the drive, and the huge demands it placed on the player hoping to actually solve its case, it served as a veritable model for what Jacob didn’t want his games to be. Other forms of entertainment favored by busy adults weren’t so demanding. Quite the opposite, in fact. His conception of adult gaming would have it be as easy-going and accessible as television. Thus one might characterize Jacob’s vision as essentially Trip Hawkins’s old dictum of “simple, hot, and deep,” albeit with a bit more emphasis on the “hot” and a bit less on the “deep.” The next important question was where to find those more varied and nuanced fictional experiences. For a movie buff living on the very doorstep of Tinsel Town, the answer must have all but announced itself of its own accord.

Bookware aside, the game industry had to some extent been aping the older, more established art form of film for a while already. The first attempt that I’m aware of to portray a computer game as an interactive movie came with Sierra’s 1982 text-adventure epic Time Zone, the advertising for which was drawn as a movie poster, complete with “Starring: You,” “Admission: $99.95,” and a rating of “UA” for “Ultimate Adventure.” It was also the first game that I’m aware of to give a credit for “Producer” and “Executive Producer.” Once adopted and popularized by Electronic Arts the following year, such movie-making terminology spread quickly all over the game industry. Now Bob Jacob was about to drive the association home with a jackhammer.

Each Cinemaware game would be an interactive version of some genre of movies, drawn from the rich Hollywood past that Jacob knew so well. If nothing else, Hollywood provided the perfect remedy for writer’s block: “Creatively it was great because we had all kinds of genres of movies to shoot for.” Many of the movie genres in which Cinemaware would work felt long-since played-out creatively by the mid-1980s, but most gaming fictions were still so crude by comparison with even the most hackneyed Hollywood productions that it really didn’t matter: “I was smart enough and cynical enough to realize that all we had to do was reach the level of copycat, and we’d be considered a breakthrough.”

Cynicism notwithstanding, the real, obvious love that Jacob and a number of his eventual collaborators had for the movies they so self-consciously evoked would always remain one of the purest, most appealing things about Cinemaware. Their manuals, scant and often almost unnecessary as they would be, would always make room for an affectionate retrospective on each game’s celluloid inspirations. At the same time, though, we should understand something else about the person Jacob was and is. He’s not an idealist or an artist, and certainly not someone who spends a lot of time fretting over games in terms of anything other than commercial entertainment. He’s someone for whom phrases like “mass-market appeal” — and such phrases tend to come up frequently in his discourse — hold nary a hint of irony or condescension. Even his love of movies, genuine as it may be, reflects his orientation toward mainstream entertainment. You’ll not find him waiting for the latest Criterion Collection release of Bergman or Truffaut. No, he favors big popcorn flicks with, well, mass-market appeal. Like so much else about Jacob, this sensibility would be reflected in Cinemaware.

Financing for a new developer wasn’t an easy thing to secure in the uncertain industry of 1985. Perhaps in response, Jacob initially conceived of his venture as a very minimalist operation, employing only himself and his wife Phyllis on a full-time basis. The other founding member of the inner circle was Kellyn Beeck, a friend, software acquisitions manager at Epyx, fellow movie buff, and frustrated game designer. The plan was to give him a chance to exorcise the latter demon with Cinemaware. Often working from Jacob’s initial inspiration, he would provide outside developers with design briefs for Cinemaware games, written in greater or lesser detail depending on the creativity and competency of said developers. When the games were finished, Jacob would pass them on to Mindscape for publication as part of the Cinemaware line. One might say that it wasn’t conceptually all that far removed from the sort of facilitation Jacob had been doing for a couple of years already as a software agent. It would keep the non-technical Jacob well-removed from the uninteresting (to him) nuts and bolts of software development. Jacob initially called his company Master Designer Software, reflecting both an attempt to “appeal to the ego of game designers” and a hope that, should the Cinemaware stuff turn out well, he might eventually launch other themed lines. Cinemaware would, however, become such a strong brand in its own right in the next year or two that Jacob would end up making it the name of his company. I’ll just call Jacob’s operation “Cinemaware” from now on, as that’s the popular name everyone would quickly come to know it under even well before the official name change.

After nearly a year of preparation, Jacob pulled the trigger on Cinemaware at last in January of 1986, when in a matter of a few days he legally formed his new company, signed a distribution contract with Mindscape, and signed contracts with outsiders to develop the first four Cinemaware games, to be delivered by October 15, 1986 — just in time for Christmas. Two quite detailed design briefs went to Sculptured Software of Salt Lake City, a programming house that had made a name for themselves as a porter of games between platforms. Of Sculptured’s Cinemaware projects, Defender of the Crown, the title about which Jacob and Beeck were most excited, was inspired by costume epics of yesteryear featuring legendary heroes like Ivanhoe and Robin Hood, while SDI was to be a game involving Ronald Reagan’s favorite defense program and drawing its more tenuous cinematic inspiration from science-fiction classics ranging from the Flash Gordon serials of the 1930s to the recent blockbuster Star Trek II: The Wrath of Khan. The other two games went to proven lone-wolf designer/programmers, last of a slowly dying breed, and were outlined in much broader strokes. King of Chicago, given to a programmer named Doug Sharp who had earlier written a game called ChipWits, an interesting spiritual successor to Silas Warner’s classic Robot War, was to be an homage to gangster movies. And Sinbad and the Throne of the Falcon was given to one Bill Williams, who had earlier written such Atari 8-bit hits as Necromancer and Alley Cat and had just finished the first commercial game ever released for the Amiga, Mind Walker. His game would be an homage to Hollywood’s various takes on the Arabian Nights. Excited though he was by the Amiga, Jacob hedged his bets on his platforms just as he did on his developers, planning to get at least one title onto every antagonist in the 68000 Wars before 1986 was out. Only Defender of the Crown and Sinbad were to be developed and released first on the Amiga; King of Chicago would be written on the Macintosh, SDI on the Atari ST. If all went well, ports could follow.

All of this first wave of Cinemaware games as well as the ones that would follow will get their greater or lesser due around here in articles to come. Today, though, I want to concentrate on the most historically important if certainly not the best of Cinemaware’s works, Defender of the Crown.

Our noble Saxon hero on the job.

Defender of the Crown, then, takes place in a version of medieval England that owes far more to cinema than it does to history. As in romantic depictions of Merry Olde England dating back at least to Walter Scott’s Ivanhoe, the stolid English Saxons are the heroes here, the effete French Normans — despite being the historical victors in the struggle for control of England — the villains. Thus you play a brave Saxon lord struggling against his Norman oppressors. Defender of the Crown really doesn’t make a whole lot of sense as history, fiction, or legend. A number of its characters are drawn from Ivanhoe, which might lead one to conclude that it’s meant to be a sequel to that book, taking place after Richard I’s death has thrown his kingdom into turmoil once again. But if that’s the case then why is Reginald Front-de-Boeuf, who was killed in Ivanhoe, running around alive and well again? Should you win Defender of the Crown, you’ll be creating what amounts to an alternate history in which the Saxons throw off the Norman yoke and regain control of England. Suffice to say that the only history that Defender of the Crown is really interested in is the history of Hollywood. What it wants to evoke is not the England of myth or reality, but the England of the movies so lovingly described in its manual. It has no idea where it stands in relation to Ivanhoe or much of anything else beyond the confines of a Hollywood sound stage, nor does it care. Given that, why should we? So, let’s agree to just go with it.

The core of Defender of the Crown: Risk played in Merry Olde England

Defender of the Crown is essentially Risk played on a map of England. The other players in the game include three of the hated Normans and two other Saxon lords, who generally try to avoid attacking their ethnic fellows unless space starts getting really tight. Your goal is of course to wipe the Normans from the map and make of England a Saxon kingdom again. Woven into the simple Risk-like strategy game are a handful of action-oriented minigames that can be triggered by your own actions or those of the other lords: a grand jousting tournament, a midnight raid on an enemy castle, a full-on siege complete with a catapult that you use to knock down a beleaguered castle’s walls. In keeping with Jacob’s vision of Cinemaware games as engaging but light entertainments, a full game usually takes well under an hour to play, and there is no provision for saving or restoring.

From the beginning, it was Jacob’s intention to really pull out all the stops for Defender of the Crown in particular amongst his launch titles, to make of it an audiovisual showcase the likes of which had never been seen before. Shortly after signing Sculptured Software to do the programming, he therefore signed Jim Sachs to work with them, giving him a title familiar to Hollywood but new to the world of games: Art Director.

A Jim Sachs self-portrait, one of his early Amiga pictures that won him the job of Art Director for Defender of the Crown.

A self-taught artist from childhood and a programmer since he’d purchased a Commodore 64 just a few years before, Sachs had made quite a name for himself in quite a short time in Commodore circles. He’d written and released a game of his own for the Commodore 64, Saucer Attack, that mixed spectacular graphics with questionable gameplay (an accusation soon to be leveled against Defender of the Crown as well). He’d then spent a year working on another game, to be called Time Crystal, that never got beyond a jawdropping demo that made the rounds of Commodore 64 BBSs for years. He’d been able to use this demo and Saucer Attack to convince Commodore to give him developer’s status for the Amiga, allowing him access to pre-release hardware. Sachs’s lovely early pictures were amongst the first to be widely distributed to Amiga users, making him the best-known of the Amiga’s early hacker artists prior to Eric Graham flooring everyone with his Juggler animation in mid-1986. Indeed, Sachs was quite possibly the best Amiga painter in the world when Jacob signed him up to do Defender of the Crown — Andy Warhol included. He would become the most important single individual to work on the game. If it was unusual for an artist to become the key figure behind a game, that itself was an illustration of what made Cinemaware — and particularly Defender of the Crown — so different from what had come before. As he himself was always quick to point out, Sachs by no means personally drew every single one of the many lush scenes that make up the game. At least seven others contributed art, an absolutely huge number by the standards of the time, and another mark of the unprecedented scale on which Cinemaware was working. It is fair to say, however, that Sachs’s virtual brush swept over every single one of the game’s scenes, tweaking a shadow here, harmonizing differing styles there. His title of Art Director was very well-earned.

This knight, first distributed by Jim Sachs as a standalone picture, would find his way into Defender of the Crown almost unaltered.

By June of 1986 Sachs and company had provided Sculptured Software with a big pile of mouth-watering art, but Sculptured had yet to demonstrate to Jacob even the smallest piece of a game incorporating any of it. Growing concerned, Jacob flew out to Salt Lake City to check on their progress. What he found was a disaster: “Those guys were like nowhere. Literally nowhere.” Their other game for Cinemaware, SDI, was relatively speaking further along, but also far behind schedule. It seemed that this new generation of 68000-based computers had proved to be more than Sculptured had bargained for.

Desperate to meet his deadline with Mindscape, Jacob took the first steps toward his eventual abandonment of his original concept of Cinemaware as little more than a creative director and broker between developer and publisher. He hired his first actual employee beyond himself and Phyllis, a fellow named John Cutter who had just been laid off following Activision’s acquisition of his previous employer Gamestar, a specialist in sports games. Cutter, more technical and more analytical than Jacob, would become his right-hand man and organizer-in-chief for Cinemaware’s many projects to come. His first task was to remove Sculptured Software entirely from Defender of the Crown. SDI they were allowed to keep, but from now on they’d work on it under close supervision from Cutter. Realizing he needed someone who knew the Amiga intimately to have a prayer of completing Defender of the Crown by October 15, Jacob called up none other than R.J. Mical, developer of Intuition and GraphiCraft, and made him an offer: $26,000 if he could take Sachs’s pile of art and Jacob and Beeck’s design, plus a bunch of music Jacob had commissioned from a composer named Jim Cuomo, and turn it all into a finished game within three months. Mical simply said — according to Jacob — “I’m your man.”

He got it done, even if it did nearly kill him. Mical insists to this day that Jacob wasn’t straight with him about the project, that the amount of work it ended up demanding of him was far greater than what he had been led to expect when he agreed to do the job. He was left so unhappy by his rushed final product that he purged his own name from the in-game credits. Sachs, too, is left with what he calls a “bitter taste,” feeling Jacob ended up demanding far, far more work from him than was really fair for the money he was paid. Many extra graphical flourishes and entire additional scenes that Mical simply didn’t have time or space to incorporate into the finished product were left on the cutting-room floor. Countless 20-hour days put in by Sachs and his artists thus went to infuriating waste in the name of meeting an arbitrary deadline. Sachs claims that five man-weeks’ worth of graphics were thrown out for the jousting scenes alone. Neither Sachs nor Mical would ever work with Cinemaware again.

Jousting, otherwise known as occasionally knocking the other guy off his horse for no discernible reason but mostly getting unhorsed yourself.

Many gameplay elements were also cut, while even much of what did make it in has an unfinished feel about it. Defender of the Crown manages the neat trick of being both too hard and too easy. What happens on the screen in the various action minigames feels peculiarly disconnected from what you actually do with the mouse. I’m not sure anyone has ever entirely figured out how the jousting or swordfighting games are even supposed to work; random mouse twiddling and praying would seem to be the only viable tactics. And yet the Risk-style strategic game is almost absurdly easy. Most players win it — and thus Defender of the Crown as a whole — on their second if not their first try, and then never lose again.

Given this, it would be very easy to dismiss Defender of the Crown entirely. And indeed, plenty of critics have done just that, whilst often tossing the rest of Cinemaware’s considerable catalog into the trash can of history alongside it. But, as the length of this article would imply, I’m not quite willing to do that. I recognize that Defender of the Crown isn’t really up to much as a piece of game design, yet even today that doesn’t seem to matter quite as much as it ought to. Simplistic and kind of broken as it is, it’s still a surprisingly entertaining experience — certainly enough so to be worth a play or two. And back in 1986… well, I united England under the Saxon banner a ridiculous number of times as a kid, long after doing so became rote. In thinking about Defender of the Crown’s appeal, I’ve come to see it as representing an important shift not just in the way that games are made but also in the way that we experience them. To explain what I mean I need to get a bit theoretical with you, just for a moment.

Whilst indulging in a bit of theory in an earlier article, I broke down a game into three component parts: its system of rules and mechanics, its “surface” or user interface, and its fictional context. I want to set aside the middle entry in that trio and just think about rules and context today. As I also wrote in that earlier article, the rise in earnest of what I call “experiential games” from the 1950s onward is marked by an increased interest in the latter in comparison to the former, as games became coherent fictional experiences to be lived rather than mere abstract systems to be manipulated in pursuit of a favorable outcome. I see Defender of the Crown and the other Cinemaware games as the logical endpoint of that tendency. In designing the game, Bob Jacob and Kellyn Beeck started not with a mechanical concept — grand strategy, text adventure, arcade action, etc. — but with a fictional context: a recreation of those swashbuckling Hollywood epics of yore. That the mechanical system they came up with to underlie that fiction — a simplified game of Risk peppered by equally simplistic action games — is loaded with imperfections is too bad but also almost ancillary to Defender of the Crown the experience. The mechanics do the job just well enough to make themselves irrelevant. No one comes to Defender of the Crown to play a great strategy game. They come to immerse themselves in the Merry Olde England of bygone Hollywood.

For many years now there have been voices stridently opposed to the emphasis a game like Defender of the Crown places on its fictional context, with the accompanying emphasis on foreground aesthetics necessary to bring that context to life. Chris Crawford, for instance, dismisses not just this game but Cinemaware as a whole in one paragraph in On Game Design as “lots of pretty pictures and animated sequences” coupled to “weak” gameplay. Gameplay is king, we’re told, and graphics and music and all the rest don’t — or shouldn’t — matter a whit. Crawford all but ranks games’ critical worth entirely according to what he calls their “process intensity”: their ratio of dynamic, interactive code — i.e., gameplay — to static art, sound, music, even text. If one accepts this point of view in whole or in part, as many of the more prominent voices in game design and criticism tend to do, it does indeed become very easy to dismiss the entire oeuvre of Cinemaware as a fundamentally flawed concept and, worse, a dangerous one, a harbinger of further design degradations to come.

Speaking here as someone with an unusual tolerance for ugly graphics — how else could I have written for years now about all those ugly 8-bit games? — I find that point of view needlessly reductive and rather unfair. Leaving aside that beauty for its own sake, whether found in a game or in an art museum, is hardly worthy of our scorn, the reality is that very few modern games are strictly about their mechanics. Many have joined Defender of the Crown as embodied fictional experiences. This is the main reason that many people play them today. If beautiful graphics help us to feel embodied in a ludic world, bully for them. I’d argue that the rich graphics in Defender of the Crown carry much the same water as the rich prose in, say, Mindwheel or Trinity. Personally — and I understand that mileages vary here — I’m more interested in becoming someone else or experiencing — there’s that word again! — something new to me for a while than I am in puzzles, strategy, or reflex responses in the abstract. I’d venture to guess that most gamers are similar. In some sense modern games have transcended games — i.e., a system of rules and mechanics — as we used to know them. Commercial and kind of crass as it sometimes is, we can see Defender of the Crown straining toward becoming an embodied, interactive, moving, beautiful, fictional experience rather than being just the really bad take on Risk it unquestionably also is.

A fetching lass gives you the old come-hither stare. Those partial to redheads or brunettes also have options.

A good illustration of Defender of the Crown’s appeal as an experiential fiction as well as perhaps a bit of that aforementioned crassness is provided by the game’s much-discussed romantic angle. No Hollywood epic being complete without a love interest for the dashing hero, you’ll likely at some point during your personal epic get the opportunity to rescue a Saxon damsel in distress from the clutches of a dastardly Norman. We all know what’s bound to happen next: “During the weeks that follow, gratitude turns to love. Then, late one night…”

Consummating the affair. Those shadows around waist-level are… unfortunate. I don’t think they’re actually supposed to look like what they look like, although they do give a new perspective to the name of “Geoffrey Longsword.”

After the affair is consummated, your new gal accompanies you through the rest of the game. It’s important to note here that she has no effect one way or the other on your actual success in reconquering England, and that rescuing her is actually one of the more difficult things to do in Defender of the Crown, as it requires that you engage with the pretty terrible swordfighting game; I can only pull it off if I pick as my character Geoffrey Longsword, appropriately enough the hero with “Strong” swordfighting skills. Yet your game — your story — somehow feels incomplete if you don’t manage it. What good is a hero without a damsel to walk off into the sunset with him? There are several different versions of the virgin (sorry!) that show up, just to add a bit of replay value for the lovelorn.

As I’ve written earlier, 1986 was something of a banner year for sex in videogames. The love scene in Defender of the Crown, being much more, um, graphic than the others, attracted particular attention. Many a youngster over the years to come would have his dreams delightfully haunted by those damsels. Shortly after the game’s release, Amazing Computing published an unconfirmed report from an “insider” that the love scene was originally intended to be interactive, requiring “certain mouse actions to coax the fair woman, who reacted accordingly. After consulting with game designers and project management, the programmer supposedly destroyed all copies of the source code to that scene.” Take that with what grains of salt you will. At any rate, a sultry love interest would soon become a staple of Cinemaware games, for the very good reason that the customers loved them. And anyway, Jacob himself, as he later admitted in a revelation bordering on Too Much Information, “always liked chesty women.” It was all horribly sexist, of course, something Amazing Computing pointed out by declaring Defender of the Crown the “most anti-woman game of the year.” On the other hand, it really wasn’t any more sexist than its cinematic inspirations, so I suppose it’s fair enough when taken in the spirit of homage.

Cinemaware wasn’t shy about highlighting one of Defender of the Crown’s core appeals. Did someone mention sexism?

The buzz about Defender of the Crown started inside Amiga circles even before the game was done. An early build was demonstrated publicly for the first time at the Los Angeles Commodore Show in September of 1986; it attracted a huge, rapt crowd. Released right on schedule that November through Mindscape, Defender of the Crown caused a sensation. Amiga owners treated it as something like a prophecy fulfilled; this was the game they’d all known the Amiga was capable of, the one they’d been waiting for, tangible proof of their chosen platform’s superiority over all others. And it became an object of lust — literally, when the gorgeously rendered Saxon maidens showed up — for those who weren’t lucky enough to own Commodore’s wunderkind. You could spend lots of time talking about all of the Amiga’s revolutionary capabilities — or you could just pop Defender of the Crown in the drive, sit back, and watch the jaws drop. The game sold 20,000 copies before the end of 1986 alone, astounding numbers considering that the total pool of Amiga owners at that point probably didn’t number much more than 100,000. I feel pretty confident in saying that just about every one of those 80,000 or so Amiga owners who didn’t buy the game right away probably had a pirated copy soon enough. It would go on to sell 250,000 copies, the “gift that kept on giving” for Jacob and Cinemaware for years to come. While later Cinemaware games would be almost as beautiful and usually much better designed — not to mention having the virtue of actually being finished — no other would come close to matching Defender of the Crown’s sales numbers or its public impact.

Laying siege to a castle. The Greek fire lying to the left of the catapult can’t be used. It was cut from the game but not the graphics, only to be added back in later ports.

Cinemaware ported Defender of the Crown to a plethora of other platforms over the next couple of years. Ironically, virtually all of the ports were much better games than the Amiga version, fixing the minigames to make them comprehensible and reasonably entertaining and tightening up the design to make it at least somewhat more difficult to sleepwalk to victory. In a sense, it was Atari ST users who got the last laugh. That, anyway, is the version that some aficionados name as the best overall: the graphics and sound aren’t quite as good, but the game behind them has been reworked with considerable aplomb. Even so, it remained and remains the Amiga version that most people find most alluring. Without those beautiful graphics, there just doesn’t seem to be all that much point to Defender of the Crown. Does this make it a gorgeous atmospheric experience that transcends its game mechanics or just a broken, shallow game gussied up with lots of pretty pictures? Perhaps it’s both, or neither. Artistic truth is always in the eye of the beholder. But one thing is clear: we’ll be having these sorts of discussions a lot as we look at games to come. That’s the real legacy of Defender of the Crown — for better or for worse.

(Sources: On the Edge by Brian Bagnall; Computer Gaming World of January/February 1985, March 1987 and August/September 1987; Amazing Computing #1.9, February 1987, April 1987, and July 1987; Commodore Magazine of October 1987 and November 1988; AmigaWorld of November/December 1986. Jim Sachs has been interviewed in more recent years by Kamil Niescioruk and The Personal Computer Museum. Matt Barton and Tristan Donovan have each interviewed Bob Jacob for Gamasutra.

Defender of the Crown is available for purchase for Windows and Mac from GOG.com and on the App Store for iOS, for those of you wanting to visit Merry Olde England for yourselves. All emulate the historically definitive if somewhat broken Amiga version, featuring the original Amiga graphics and sound.)

 
 


The 68000 Wars, Part 3: We Made Amiga, They Fucked It Up

The Commodore/Amiga honeymoon could hardly have been more idyllic. Honoring the wishes of everyone at Amiga to not get shipped off to Commodore’s headquarters in West Chester, Pennsylvania, Commodore instead moved them just ten miles from their cramped offices in Santa Clara, California, to a spacious new facility in Los Gatos, surrounded by greenery and well-tended walking paths that gave it something of the atmosphere of a university campus. The equipment at their disposal was correspondingly upgraded; instead of fighting one another for the use of a handful of aging Sage IV workstations, everyone in a significant technical role now got a brand new Sun workstation of his own. Best of all, Commodore knew when to back off. With their charges now relocated and properly equipped, they left them to it. “Commodore,” says R.J. Mical, “did the best thing they possibly could have done to make sure the product they bought was successful. They left us alone.” They were all “vastly in love with Commodore” in those early days. After all they’d just been through, how could they not be?

With Jay Miner’s chipset, the heart of their project, largely complete before the acquisition, Amiga’s focus now shifted to all of the stuff that would need to surround those chips to finish their computer, now to be called not the Amiga Lorraine but the Commodore Amiga. The need for an operating system becoming urgent, the software folks now came to the fore. The three most prominent systems programmers at Amiga each authored one layer of the software stack that would become the soul of the machine. Carl Sassenrath wrote the Exec, the kernel of a new operating system that borrowed many ideas from bigger institutional operating systems like Unix, not least among them the revolutionary capability of true preemptive multitasking. Atop that Dale Luck layered the Graphics Library, a collection of software hooks to let programmers unlock the potential of Miner’s chipset in a multitasking-friendly way, without having to bang on the hardware itself. And atop that R.J. Mical layered Intuition, a toolbox of widgets, icons, menus, windows, and dialogs to let programmers build GUI applications with a consistent look and feel.
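
(For the technically inclined, here is a rough idea of what programming against that stack looked like in practice. This is a minimal sketch only, assuming the classic 1.x-era headers and constants; the window title, dimensions, and flags are my own illustrative choices, not code from any shipping Amiga program.)

```c
/* A minimal 1.x-era Intuition "hello world": open a window on the
   Workbench screen and sleep until its close gadget is clicked.
   Details varied by compiler and OS release. */
#include <exec/types.h>
#include <intuition/intuition.h>
#include <proto/exec.h>
#include <proto/intuition.h>

struct IntuitionBase *IntuitionBase;

int main(void)
{
    struct Window *window;
    struct NewWindow nw = { 0 };

    /* All OS services live in shared libraries opened at run time. */
    IntuitionBase = (struct IntuitionBase *)
        OpenLibrary("intuition.library", 0L);
    if (IntuitionBase == NULL)
        return 20;

    nw.LeftEdge = 50;   nw.TopEdge = 40;
    nw.Width    = 320;  nw.Height  = 120;
    nw.Title    = (UBYTE *)"Hello from Intuition";
    nw.Flags    = WINDOWCLOSE | WINDOWDRAG | SMART_REFRESH | ACTIVATE;
    nw.IDCMPFlags = CLOSEWINDOW;   /* ask Intuition for close-gadget events */
    nw.Type     = WBENCHSCREEN;    /* share the Workbench screen */

    window = OpenWindow(&nw);
    if (window != NULL) {
        /* Block until an event arrives; the program consumes no CPU
           while it waits, courtesy of Exec's preemptive multitasking. */
        WaitPort(window->UserPort);
        CloseWindow(window);
    }
    CloseLibrary((struct Library *)IntuitionBase);
    return 0;
}
```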

But even as the rest of the system was coming together around it, Miner continued to tinker with his chipset. Out of these late experiments arose one of the most important capabilities of the Amiga, one absolutely key to its status as the world’s first multimedia PC. In the Amiga’s low-resolution modes of 320 X 200 and 320 X 400, Denise was normally capable of displaying up to 32 colors chosen from a palette of 4096. Miner now came up with a way of displaying any or all 4096 at once, using a technique he called “hold and modify” whereby Denise could create the color of each pixel by modifying only the red, green, or blue component of the previous pixel. He hoped it would allow programmers to create photorealistic environments for flight simulators, a special interest of his. When he realized that HAM mode updated too slowly to offer a decent frame rate for such applications, he actually requested that it be removed again from the chipset. But the chip fabricators said it would cost precious time and money to do so, and since it wasn’t hurting anything they might as well leave it in. Thank God for those bean counters. While it would indeed prove of limited utility for flight simulators and other games, HAM would allow the Amiga to display captured photographs of the real world. As advertisements for Digi-View, the first practical photorealistic digitizer to reach everyday computers, would soon put it, “Digi-View brings the world into your Amiga!” It’s that very blending of the analog world around us with the digital world inside the computer that is the key to the multimedia experience that the Amiga was first to provide. HAM mode stands as a classic object lesson in unintended consequences of technological innovation. The Amiga’s claim to historical importance would have been much shakier without it.
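
(Again for the technically inclined, the hold-and-modify trick is simple enough to sketch in a few lines of C. The following is my own illustrative decoder, assuming the standard HAM6 layout just described — two control bits choosing between "set from a 16-entry palette" and "hold the previous color, modifying one of its three four-bit components" — with hypothetical function and type names.)

```c
#include <stdint.h>

/* One 12-bit Amiga color: 4 bits each of red, green, and blue (0-15). */
typedef struct { uint8_t r, g, b; } Rgb;

/* Decode one scanline of 6-bit HAM pixel values into RGB triples.
   palette holds the 16 base color registers. */
void decode_ham6_line(const uint8_t *pixels, int width,
                      const Rgb palette[16], Rgb *out)
{
    Rgb current = palette[0];                    /* each line starts from color 0 */
    for (int i = 0; i < width; i++) {
        uint8_t control = (pixels[i] >> 4) & 3;  /* top two bitplanes */
        uint8_t data    = pixels[i] & 0x0F;      /* bottom four bitplanes */
        switch (control) {
        case 0: current = palette[data]; break;  /* set from palette */
        case 1: current.b = data; break;         /* hold, modify blue */
        case 2: current.r = data; break;         /* hold, modify red */
        case 3: current.g = data; break;         /* hold, modify green */
        }
        out[i] = current;                        /* any of the 4096 colors */
    }
}
```

The catch is visible right there in the loop: a pixel that needs an entirely new color can change only one of its three components per step, so sharp transitions produce color fringing unless the image is carefully arranged — fine for static photographs, fatal for fast-moving flight simulators.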

As 1984 turned into 1985, Commodore’s patience with the sort of endless tinkering that had led to HAM mode began to decrease; they wanted Los Gatos to just get the Amiga done already. The splashy debut the Atari ST made at the Winter Consumer Electronics Show in January spooked the brass back in West Chester. And by the spring of 1985, with the home-computer market clearly on the downturn, Commodore’s financial position was beginning to look downright precarious. They needed the Amiga, and soon.

A new hire named Howard Stolz, young and inexperienced like so many of the others, became the project’s unsung hero by crafting the external appearance for the new computer. His sleek, trim case still looks great today; whatever else you can say about that first Amiga model, it remains by far the best-looking Amiga ever released. Then and now, one is first struck upon seeing it by how small it is; even Apple’s contemporaneous machines look chunky and clunky next to it. And it’s full of thoughtful little touches, like the “garage” below the system unit into which one can slide the keyboard when not in use. Imprinted on the inside cover of the system unit are the signatures of the core Amiga team, an idea borrowed from the original Apple Macintosh. Amongst them is the paw print of Jay Miner’s beloved Mitchie.

The Amiga made its public debut at last on July 23, 1985, in the most surreal event in the long history of Commodore. Obviously hoping to duplicate the sort of excitement Apple had become so adept at evoking, Commodore rented New York’s Lincoln Center to put on a show the likes of which they never had before and never would again. The black-tie event sported an open bar stocked to the nines, waiters wandering through the crowd with plates of hors d’oeuvres, a laser show, a classical-music trio, and — no, really — a ballerina twirling across the stage. The Los Gatos team were all there, crammed awkwardly into rented tuxedos. Bob Pariseau, the traditional master of ceremonies of Amiga demos since the days when the Amiga Lorraine was just a tangled mass of wires and breadboards, once again narrated the proceedings, looking like a stage magician in his tux and long ponytail. The rabbit he pulled out of his hat for the occasion was perhaps the only computer in the world at that time that could have managed not to be overshadowed by all the pomp and circumstance. The crowd erupted into spontaneous applause on several occasions: when, thanks to HAM mode, the Amiga showed all 4096 colors onscreen at once; when the Amiga played a bit of “Smoke on the Water” in an appropriately distorted electric-guitar tone; when it talked in male and female voices; when that old favorite, Boing, showed up yet again. The evening concluded with Andy Warhol coming onstage to digitize and manipulate the image of Debbie Harry of Blondie fame, creating an end result reminiscent of his famous Marilyn Diptych of 1962. The Amiga, everyone had to agree, made for one hell of a show.

The Amiga enjoyed the best press of its career in the immediate aftermath of that Lincoln Center premiere, ironically well before anyone could actually buy one. Byte magazine, whose editorial voice was easily the most respected in the industry, devoted a luxurious 13 pages to a detailed technical preview of the machine, pronouncing it “the most advanced and innovative personal computer today.” Creative Computing, the industry’s most venerable and (often) most visionary publication, was even more effusive in its praise. The Amiga was not just a new computer but “a new communications medium — a dream machine, a new medium of expression” that the reviewer pronounced literally indescribable in print. Writing for Computer Gaming World, Jon Freeman declared that “anything your favorite computer can do, the Amiga can do better. And faster. And in stereo.”

Freeman published his games through Electronic Arts, and in writing his article on the machine was very much toeing his publisher’s line. By far the new computer’s most enthusiastic and stalwart supporter, who had followed it with interest since well before the Commodore acquisition, was EA. Trip Hawkins, still nursing his dream of EA software titles lined up on the shelf of every hipster aesthete alongside the music albums they were consciously packaged to evoke, grasped right away what the Amiga could mean for computerized entertainment. For him it was the Great White Hope for an industry suffering through its first real downturn ever and struggling to understand just what had gone wrong. Receiving their first prototypes many months before the Lincoln Center premiere, EA had worked hand-in-hand with Los Gatos to refine the machine and get a jump start on writing software for it.

Thus much of the earliest software available for the Amiga came from EA, including ports of old favorites like Archon and Seven Cities of Gold as well as new titles destined to become Amiga icons: DeluxePaint, Arcticfox, Marble Madness. In the immediate wake of the Amiga’s release, while most publishers were adopting a wait-and-see position on the new machine, EA offered full-throated support via splashy multi-page editorials that ran in just about every publication in the industry.

The Amiga will revolutionize the home-computer industry. It’s the first home machine that has everything you want and need for all the major uses of a home computer, including entertainment, education, and productivity. The software we’re developing for the Amiga will blow your socks off. We think the Amiga, with its incomparable power, sound, and graphics, will give Electronic Arts and the entire industry a very bright future.

We believe that one day soon the home computer will be as important as radio, stereo, and television are today.

But so far, the computer’s promise has been hard to see. Software has been severely limited by the abstract, blocky shapes and rinky-dink sound reproduction of most home computers. Only a handful of pioneers have been able to appreciate the possibilities. But then, popular opinion once held that television was only useful for civil-defense communications.

The Amiga is advancing our medium on all fronts. For the first time, a personal computer is providing the visual and aural quality our sophisticated eyes and ears demand. Compared to the Amiga, using some other home computers is like watching black-and-white television with the sound turned off.

For the first time, software developers have the tools they need to fulfill the promise of home computing.

Two years ago, we said, “We See Farther.” Now Farther is here.

With praise like that, how could anything go wrong?

Well, anything could, and for a while there it seemed like just about everything did. After the premiere and the rapturous press it generated, much momentum was squandered as Commodore struggled to put the finishing touches on the Amiga and get the machine, so much more complicated than anything the company had built or supported before, into production. It wasn’t until November that one could hope to walk into a store and walk out with a new Amiga. Commodore’s advertising campaign that started up then was as unfocused as a confetti cannon. In lieu of a coherent argument for what the Amiga represented and why it mattered, Commodore gave the public black-and-white footage of the Baby Boom Generation and tired rhetoric about keeping up with the Joneses. Commodore had somehow decided that the best way to sell the most futuristic, technologically advanced computer on the market was to evoke… nostalgia.

Just why did EA seem to understand what the Amiga represented so much better than Commodore themselves? Why was EA so much better at selling Commodore’s computer than Commodore? EA unhesitatingly and unreservedly laid out a compelling case for the Amiga as a revolutionary technology for home entertainment. Meanwhile Commodore hedged their bets everywhere — except in the Amiga’s most obvious application as a game machine, from which they ran terrified.

Then, within weeks of the Amiga’s arrival in stores, Commodore’s advertising disappeared completely. The reason was a pretty basic one: Commodore simply couldn’t afford to pay for it anymore. The previous year had been so disastrous that they were suddenly teetering on the verge of bankruptcy.

After that magical year of 1983, when Commodore had briefly become a billion-dollar company and briefly been even bigger than Apple, there had been little but bad news on the financial front. 1984 had marked a gradual cooling of the excitement surrounding home computers. That was a problem for many companies, but few more so than Commodore: Commodore represented fully 60 percent of the home-computer hardware market by that point, and had long since axed all of their more expensive machines. For them 1984 brought the failure of the eminently fail-worthy Plus/4, an alarming buildup of Commodore 64 inventories, and a disappointing Christmas that failed to come close to the previous one. And yet their troubles were only just beginning.

In 1985 a slowing home-computer market turned into a collapsing home-computer market. Suddenly Commodore was posting massive losses, to the tune of almost $200 million in 1985 alone. Their mounting debt amounted to about the same figure. By the beginning of 1986 their unsold inventory amounted to almost half a billion dollars, and layoffs had halved their workforce from 7000 to 3500. Not only was Commodore forced to effectively give up on advertising the Amiga in the mainstream media, but they didn’t even go to the biggest party in their own industry, Winter CES, in January of 1986; they simply couldn’t afford to. Ahoy! magazine pronounced Commodore’s absence akin to “Russia resigning from the Soviet Bloc, Sly leaving the Family Stone.”

Most of the people who bought home computers in 1982 and 1983 had bailed out quickly once they realized how limited their machines really were, while the remainder already had their Commodore 64s, thank you very much. And the rest of the population, the ones who were supposed to keep buying and buying for years to come, simply weren’t interested anymore. What was Commodore supposed to do, saddled as they were with bloat like the massive West Chester campus that Jack Tramiel had bought for them at the height of 1983’s success, which even then they hadn’t been able to begin to fill?

That was a question that lots of bankers were now asking themselves because Commodore had now fallen into default on their debt obligations. The financial community wasn’t inclined to take very much on faith when it came to this company to which an air of the fly-by-night had always clung even in its glory years. Thus it came down to a hard-headed calculus. Was their best bet to demand their payments, forcing Commodore into bankruptcy and liquidation and giving the lenders a chance to recoup what they could? Or would it be better to wait and see if things looked likely to turn around? For agonizing weeks they held Commodore’s future in their hands while the Wall Street Journal and business pages around the world speculated on the over-under of the company being forced to fold. At last, in March of 1986, a deal was reached: Commodore would get another loan package worth $135 million with which to service their existing debt and fund their efforts to turn things around. It amounted to a lease on life of about one year.

The doors would stay open for the time being, but Commodore was now known far and wide — not least to potential Amiga buyers — as a company teetering on the edge of a financial cliff. And even if you decided it was worth risking such a major purchase from a company that looked very likely to leave the Amiga an orphan, you still had to find someplace to actually buy one. Therein lies a tale in itself.

There were two entirely separate distribution channels for computers in the mid-1980s: the network of specialized dealers, who offered service, advice, and support along with computers to their customers; and the mass merchants, big-box stores like Sears and Toys ‘R’ Us and the big consumer-electronics chains, who sold computers alongside televisions and washing machines and offered little to nothing in the way of support, competing instead almost entirely on the basis of price. Commodore under Jack Tramiel had pioneered the latter form of distribution with the VIC-20, the first truly mass-market home computer. Most people were happy to buy a relatively cheap machine, especially one meant for casual home use, through a big-box store. Those spending more money, and especially those buying a machine for use in business, preferred to safeguard their investment by going through a dealer. Thus Apple, IBM, and the many makers of IBM clones like Compaq continued to sell their more expensive machines through dealers. Commodore and Atari, makers of cheaper, home-oriented machines, sold theirs through the mass market.

Now, however, Commodore found themselves with a more expensive machine and no dealer network through which to sell it, a last little poison pill left to them by Jack Tramiel. One might say that Commodore was forced to start again from scratch — except that it was actually worse than that. In late 1982 Tramiel had destroyed what was left of Commodore’s dealer network when he dumped the successor to the VIC-20, the Commodore 64, into the mass-market channel as well, just weeks after promising his long-suffering dealers that he would do no such thing. That betrayal had put many of his dealers out of business, leaving the rest to sign on with other brands whilst saying, “Never again.” New Commodore CEO Marshall Smith was honestly trying in his stolid, conservative, steel-industry way to remove the whiff of disreputability that had always clung to the company under Tramiel. But the memories of most potential dealers were still too long, no matter how impressive the machine Commodore now had to offer them. The result was that many major American cities now sported, at best, just one or two places where you could walk in and buy an Amiga. It was a crippling disadvantage.

And so the Amiga’s early customers would largely come down to the hacker hardcore, who saw the Amiga for the revolutionary technology that it was and just couldn’t not have one, in spite of it all. The early issues of Amazing Computing, the first techie magazine to devote itself to the Amiga, have some of the flavor of the early issues of Byte. Hackers probed at the machine’s many mysteries — like this unexplained “HAM mode” that was supposed to allow one to do magical things — and published their findings for others to build upon. Given by Commodore no way to expand the Amiga beyond 512 K, they figured out how to roll their own memory expansions; ditto for hard drives. Faced with a dearth of commercial software, a fellow named Fred Fish started curating disks full of the best free software and distributing them at cost to dealers to pass on to customers; the Fred Fish Collection would eventually reach over 1100 disks. A fellow named Tim Jenison devised a digitizer and started distributing disks full of incredible full-color photographs. A fellow named Eric Graham wrote a 3-D modeller and ray tracer and started passing around a jaw-dropping animation called The Juggler that, when played in computer-shop windows, quite possibly sold more Amigas than all of Commodore’s own promotional efforts combined. User groups were formed all over the country, congregations of the Amiga faithful meeting in churches and the back rooms of public libraries. It was the last great flowering of the spirit of ’75 that had spawned the PC industry in the first place. Indeed, legendary Homebrew Computer Club member John Draper, the “Captain Crunch” who had taught Steve Wozniak and Steve Jobs how to phone phreak and wrote the first practical Apple II word processor amongst other achievements, was a prominent early Amiga user. He figured out the vagaries of Intuition long before Commodore’s official documentation arrived, publishing code samples and technical tutorials, some of which were included on Fred Fish Disk #1. If the Amiga was destined to remain a cult computer, it was going to be one heck of an interesting cult.

Still, hackers with the requisite pioneering spirit and $2000 worth of disposable income weren’t in infinite supply. Sales were sluggish, if perhaps better than one might expect in light of the perfect storm of problems against which the Amiga struggled. Commodore sold about 140,000 Amigas in the first eighteen months — most in North America, some in Europe, where the machine was introduced at last only in June of 1986. As Britain’s Commodore User wryly put it, “the Amiga didn’t exactly blow the world away.”

While Commodore would have much preferred to compare the Amiga to the Macintosh, their image as a maker of low-end home computers was hard to shake. Thus the most common point of comparison in the press became Jack Tramiel’s new Atari ST line, whose earliest days in North America were far from perfect in their own right. The vast majority of the early STs shipped to Europe; of the 50,000 STs sold during the first three months, only about 10,000 were sold in North America. Like the Amiga, the ST was hobbled in North America by a sparse and shabby dealer network; even fewer dealers wanted anything to do with Jack Tramiel’s new Atari than were willing to get onboard again with the now Tramiel-less Commodore. In January Tramiel, true to his old Commodore 64 playbook, dumped the ST into the mass market. But even then distribution continued to be a problem. Most of the retailers who had filled their warehouses with Commodore 64s a couple of years ago were very skeptical of any new machines, no matter how impressive, given the moribund state of home computers in general.

Despite it all, Atari’s marketers proved to be very adept at conjuring a sense of excitement out of all proportion to the ST’s actual sales. For months it was conventional wisdom that the ST was trouncing the Amiga, outselling it by a margin of about three to one. But in September of 1986 the game was suddenly up. Preparatory to taking 15 percent of their stock public in an IPO, Atari was forced to publish a prospectus detailing their actual sales numbers. They had, it turned out, sold only about 150,000 STs to that date, 90,000 of them in Europe. It seemed the Amiga was actually slightly outselling the ST in North America, although neither platform’s numbers were exactly breathtaking. Certainly the ST’s sales were a far cry from the millions per year Jack Tramiel had confidently predicted just before its launch. The much-vaunted return of the new, lean-and-mean Atari to slim profitability in 1986 was down at least as much to a modest nostalgia-driven revival of their videogame consoles, which sold cheap but could be made even cheaper, as it was to the new ST line. Likewise, Commodore’s new 8-bit 128 model was outselling the Amiga and ST combined by a factor of at least four to one, while the old 64 was continuing to sell even better than the 128.

Yet perception, as a wise someone once said, is often reality. Nowhere is that more clearly illustrated than in the way software publishers responded to the Amiga and the ST. Makers of games and other home-oriented software were already supporting quite a number of platforms. Many were understandably reluctant to add two more. Better to choose the likely winner of the 68000 Wars and support only that one. Buying into the conventional wisdom just like everyone else, most — with Electronic Arts the glaring exception — hitched their wagon to the Atari ST, which seemed to many of them the likeliest successor to the Commodore 64. The relative positions of Commodore and Atari seemed to have neatly reversed themselves. A few years before, Atari had offered Jay Miner’s 8-bit line of computers, more technologically impressive than anything else in the industry but a bit on the expensive side and dogged by poor or nonexistent marketing. Commodore under Jack Tramiel had come along to trounce the Ataris with the Commodore 64, simple in design where the Ataris were baroque and in consequence much cheaper to make and sell. Now, with Tramiel in charge of Atari and Miner working with Commodore, history looked about to repeat itself in mirror image. The ST’s cause was helped by its being a more immediately accessible, understandable machine; the paradigm shift represented by the Amiga with its complex multitasking operating system placed many new demands on programmers, while the ST could pretty much be programmed like a super-Commodore 64.

Thus during 1986 many major game projects were begun on the ST rather than the Amiga, and many older games were ported to the ST but not the Amiga. The Amiga, despite the slim sales advantage it enjoyed at the moment, was threatened with a runaway chain reaction. As the industry was finally coming to understand, software availability was the single most important factor in most customers’ decision of which platform to buy into. These early commitments to the ST by so many publishers would result in more games and applications on the shop shelves for the ST, which would in turn result in more ST buyers, which would in turn encourage yet more software publishers to cast their lot with the ST, which would… you get the picture. Thus by the end of 1986 the mounting frustration and anger the Amiga faithful felt toward Commodore was mixed with more than a tinge of outright fear. How could Commodore, owners not only of the superior machine but also of the better-selling machine, have passively allowed Atari to control the narrative for so long? Nowhere was that frustration, anger, and fear more keenly felt than amongst the remnants of the old Amiga, Incorporated, in Los Gatos.

The team that had built the Amiga was gradually dispersing. David Morse, the man who had co-founded the company and so brilliantly jinked and weaved his way around Atari to bring it to a safe harbor at Commodore, was gone even before the Lincoln Center show, judging his work with Amiga essentially done and finding the life of a mere administrator to be less than enticing. Commodore installed a manager of their own at Los Gatos. Friction between the East and West Coast branches began to build from there. In December of 1985 R.J. Mical and Carl Sassenrath both left. Many others threatened to do so. They had to be begged and cajoled to stay at least long enough to properly finish the Amiga’s operating system, which had been released in a very imperfect state.

As the months passed and it became clear that the Amiga wasn’t becoming the mass-market sensation they’d so confidently expected, the folks at Los Gatos knew exactly who to blame. They regarded Commodore’s mishandling of the Amiga as nothing less than a personal betrayal. Someone printed up tee-shirts bearing what they claimed was the motto of Commodore’s marketing department: “Ready? Fire! Aim!” West Chester in turn saw Los Gatos as an arrogant bunch of youngsters who thought they were too cool for school. For evidence of just how far relations between Commodore and the Amiga old guard had deteriorated already by the spring of 1986, we need only look to the third revision of the operating system (version 1.2), which was being finished at that time. The Amiga folks had a habit of embedding secret messages into their software, little Easter eggs activated via obscure key sequences. Mostly these were the sort of things you might expect from talented young men a bit full of themselves: “INTUITION by =RJ Mical= Software Artist Deluxe”; “Carl EXEC Sassenrath reminds: All things are in Flux!”; “Brought to you by not a mere Wizard, but the Wizard Extraordinaire: Dale Luck.” In the aftermath of version 1.2’s release, however, word quickly spread through the Amiga community of an uglier message: “We made Amiga, They f−−−−− it up.” It didn’t take long for word to get back to West Chester; nor was it hard for them to guess who the “they” referred to. It only hardened West Chester’s perception of Los Gatos as an undisciplined romper room full of immature and ungrateful prima donnas. In June of 1986 West Chester, apparently judging the operating system to be good enough for now, brought the axe down. A whole swathe of people was cut, including Bob Pariseau, the very face of Amiga at so many presentations and trade shows.

By year’s end Los Gatos was down from a high of 80 people to just 13, Jay Miner and Dale Luck the only leftovers among the core figures we’ve met in the course of these articles. A reporter from Amazing Computing who attended a developers conference at about that time wrote that the hostility between the Los Gatos and West Chester people was now “almost palpable,” even in that public setting. This could only end one way. In March of 1987, with the lease running out on that wonderful Los Gatos campus, Commodore’s short-lived West Coast branch was shuttered, the few remaining employees given predictably unenticing offers to move to West Chester, which they predictably refused.

The old guard held an “Amiga Wake” to mark the end of their part in the Amiga story. It was almost five years to the day after Larry Kaplan had called up Jay Miner to ask if he knew any lawyers, and just days after Commodore and Atari had finally settled their long legal battle brought on by the events that followed. The theme of the party, complete with a casket at the center of the room, might easily convince one that this was a requiem not just for the team that had built the Amiga but for the dream of the Amiga itself. Given the Amiga’s commercial fortunes at that instant, it’s very possible that many who attended believed that to be exactly what it was. In actuality, though, the Amiga was just about to get a new lease on life in the form of two new models much more intelligently packaged, marketed, and, most of all, priced. The Atari ST also had brighter days ahead of it. Ironically, both platforms were destined to enjoy the best of their glory days not in North America, the continent they’d been built to conquer, but rather an ocean away in Europe. While the 68000 Wars had so far turned out to be more a slap-fight between two commercial pygmies than the titanic battle anticipated in the press, both of the principal combatants were just getting started.

(Sources: On the Edge by Brian Bagnall; Compute! of August 1985, September 1985, December 1985, and January 1987; Byte of August 1985, October 1985, and January 1987; Compute!’s Gazette of September 1985, November 1985, December 1985, and October 1986; InfoWorld of August 5 1985; Ahoy! of September 1985 and April 1986; Computer Gaming World of September/October 1985; Info of September/October 1985 and December/January 1986; Creative Computing of September 1985; New York Magazine of May 13 1985; New York Times of August 22 1985; Commodore User of June 1986; Amazing Computing of June 1986, January 1987, March 1987, and June 1987; Fortune of January 6 1986; PC Magazine of January 14 1986; Commodore Magazine of May 1987; Atari ST User of November 1986. Whew!)

 
 


ICBM

Michael Davis has created an original game based on my recent series of articles on Trinity. To say too much more about it would be to spoil it, so I’ll just tell you that it’s well worth a play — if perhaps not quite in the way you might expect. My thanks to Michael!

 


The 68000 Wars, Part 2: Jack Is Back!

Jack Tramiel, the computer executive most likely to be compared to Darth Vader. I wonder why?

In letting the March 31, 1984, deadline slip away without signing a licensing agreement with Atari, David Morse was taking a crazy risk. If he couldn’t find some way of scraping together $500,000 plus interest to repay Atari’s loan, Atari could walk away with the Amiga chipset for nothing, and Amiga would almost certainly go bust. All activity at Amiga therefore centered on getting the Lorraine ready for the Summer Consumer Electronics Show in Chicago, scheduled to begin on June 3. Summer CES was to be Amiga’s Hail Mary, their last chance to interest somebody — anybody — in what they had to offer enough to plunk down over half a million dollars just for openers, just to keep Atari from making the whole point moot.

By the time Summer CES arrived the Lorraine was a much more refined contraption than the one they had shown at Winter CES back in January, if still a long, long way from being a finished computer. Jay Miner’s custom chips had now been miniaturized and stamped into silicon, improving the machine’s reliability as much as they reduced its size. The Lorraine’s longstanding identity crisis was also now largely a thing of the past, the videogame crash and the example of the Macintosh having convinced everyone that what it ultimately needed to be was a computer, not a game console. Programmers like Carl Sassenrath, Dale Luck, and R.J. Mical had thus already started work on a proper operating system. Amiga’s computer was planned to be capable of doing everything the Mac could, but in spectacular color and with multitasking. That dream was, however, still a long way from fruition; the Lorraine could still be controlled only via a connected Sage IV workstation.

Led by software head Bob Pariseau as master of ceremonies, Amiga put on the best show they possibly could inside their invitation-only booth at Summer CES. The speech-synthesis library the software folks had put together was a big crowd-pleaser; spectators delighted in shouting out off-the-cuff phrases for the Lorraine to repeat, in either a male or female voice. But their hands-down favorite once again proved to be Boing, now dramatically enhanced: the ball now bounced side to side instead of just up and down, and a dramatic coup de grâce had been added in the form of sampled booms that moved from speaker to speaker to create a realistic soundscape. This impressive demonstration of Paula’s stereo-sound capabilities leaked beyond the confines of Amiga’s closed booth and out onto the crowded show floor, causing attendees to look around in alarm for the source of the noise.

Whatever the merits of their new-and-improved dog-and-pony show, Amiga also improved their credibility enormously by demonstrating that their chipset could work as actual computer chips and, indeed, simply by having survived and returned to CES once again. A bevy of industry heavyweights traipsed through Amiga’s booth that June: Sony, Hewlett Packard, Philips, Silicon Graphics, Apple. (Steve Jobs, ever the minimalist, allegedly scoffed at the Lorraine as over-engineered, containing too much fancy hardware for its own good.) The quantity and quality of Amiga’s write-ups in the trade press also increased significantly. Compute!, the biggest general-interest computing magazine in the country, raved that the Lorraine was “possibly the most advanced personal computer ever,” “the beginning of a completely new generation,” and “enough to make an IBM PC look like a four-function calculator.” Still, Amiga left the show without the thing they needed most: a viable alternative to Atari. With just a few weeks to go, their future looked grim. And then Commodore called.

To understand the reasons behind that phone call, we have to return to January 13, 1984, the day of that mysterious board meeting at Commodore that outraged their CEO Jack Tramiel so thoroughly as to send him storming out of the building and burning rubber out of the parking lot, never to return. In his noncommittal statements to the press immediately after the divorce was made official, Tramiel said he planned to take some time to consider his next move. For now, he and his wife were going to spend a year traveling the world, to make up for all the vacations they had skipped over the course of his long career.

At the time that he said it, he seems to have meant it. He and wife Helen made it as far as Sri Lanka by April. But by that point he’d already had all he could take of the life of leisure. He and Helen returned to the United States so Jack could start a new venture to be called simply Tramel Technology. (The spelling of the name was changed to reflect the proper pronunciation of Tramiel’s last name; most Americans’ habit of mispronouncing the last syllable had always driven him crazy.) His plan was to scrape together funding and a team and build the mass-market successor to the Commodore 64. In the process, he hoped to stick it to Commodore and especially to its chairman, with whom he had always had a — to put it mildly — fraught relationship. Business had always been war to Tramiel, but now this war was personal.

To get Tramel Technology off the ground, he needed people, and almost all of the people he knew and had confidence in still worked at Commodore. Tramiel therefore started blatantly poaching his old favorites. That April and May at Commodore were marked by a mass exodus, as suddenly seemingly every other employee was quitting, all headed to the same place. Jack’s son Sam was the first; many felt it was likely Jack’s desire to turn Commodore into the Tramiel family business that had precipitated his departure in the first place. Then Tony Takai, the mastermind of Commodore’s Japanese branch; John Feagans, who was supposed to be finishing up the built-in software for Commodore’s new Plus/4 computer; Neil Harris, programmer of many of the most popular VIC-20 games; Ira Velinsky, a production designer; Lloyd “Red” Taylor, a president of technology; Bernie Witter, a vice president of finance; Sam Chin, a manager of finance; Joe Spiteri and David Carlone, manufacturing experts; Gregg Pratt, a vice president of operations. The most devastating defectors of all were Commodore’s head of engineering Shiraz Shivji and three of his key hardware engineers: Arthur Morgan, John Heonig, and Douglas Renn.

Shiraz Shivji, Jack Tramiel's favorite engineer during his post-Commodore years.

Shiraz Shivji, Jack Tramiel’s favorite engineer of his post-Commodore years.

The mass exodus amounted to a humiliating vote of no-confidence in Irving Gould’s hand-picked successor to Tramiel, a former steel executive named Marshall Smith who was as bland as his name. The loss of engineering talent in particular left Commodore, who had already been in a difficult situation, even worse off. As Commodore’s big new machine for 1984, the Plus/4, amply demonstrated, there just wasn’t a whole lot left to be done with the 8-bit technology that had gotten Commodore this far. Trouble was, their engineers had experience with very little else. Tramiel had always kept Commodore’s engineering staff to the bare minimum, a fact which largely explains why they had nothing compelling in the pipeline now beyond the underwhelming Plus/4 and its even less impressive little brother the Commodore 16. And now, having lost four more key people… well, the situation didn’t look good.

And that was what made Amiga so attractive. At first Commodore, like Atari before them, envisioned simply licensing the Amiga chipset, in the process quite probably — again like Atari — using Amiga’s position of weakness to extort from them a ridiculously good deal. But within days of opening negotiations their thinking began to change. Here was not only a fantastic chipset but an equally fantastic group of software and hardware engineers, intimately familiar with exactly the sort of next-generation 16-bit technology with which Commodore’s own remaining engineers were so conspicuously unacquainted. Why not buy Amiga outright?

On June 29, David Morse walked unexpectedly into the lobby of Atari’s headquarters and requested to see his primary point of contact there, one John Farrand. Farrand already had an inkling that something was up; Morse had been dodging his calls and finding excuses to avoid face-to-face meetings for the last two weeks. Still, he wasn’t prepared for what happened next. Morse told him that he was here to pay back the $500,000, plus interest, and sever their business relationship. He then proceeded to practically shove a check into the hands of a very confused and, soon, very irate John Farrand. Two minutes later he was gone.

The check had of course come from Commodore, given as a gesture of good faith in their negotiations with Amiga and, more practically, to keep Atari from walking away with the technology they’d now decided they’d very much like to have for themselves. Six weeks later negotiations between Commodore and Amiga ended with the purchase by the former of the latter for $27 million. David Morse had his miracle. His investors and employees got a nice payday in return for their faith. And, most importantly, his brilliant young team would get the chance to turn Miner’s chipset into a real computer all their own, designed — for the most part — their way.

It’s worth dwelling for just a moment here on the sheer audacity of the feat Morse had just pulled off. Backed against the wall by an Atari that smelled blood in the water, he had taken their money, used it to finish the chipset and the Lorraine well enough to get him a deal with their arch-rival, then paid Atari back and walked away. It all added up to a long con worthy of The Sting. No wonder Atari, who had gotten as far as starting to design the motherboard for the game console destined to house the chipset, was pissed. And yet the Atari that would soon seek its revenge would not be the same Atari as the one he had negotiated with in March. Confused yet? To understand we must, once again, backtrack just slightly.

Atari may have been a relative Goliath in contrast to Amiga’s David in early 1984, but that’s not to say that they were financially healthy. Far from it. The previous year had been a disastrous one, marked by losses of over half a billion dollars thanks to the Great Videogame Crash. CEO Ray Kassar had left under a cloud of accusations of insider trading, mismanagement, and general incompetence; no one turns faster on a wonder boy than Wall Street. Now his successor, a once and future cigarette mogul named James Morgan, was struggling to staunch the bleeding by laying off employees and closing offices almost by the week. Parent company Warner Communications, figuring that the videogame bubble was well and truly burst, just wanted to be rid of Atari as quickly and painlessly as possible.

Jack Tramiel, meanwhile, was becoming a regular presence in Silicon Valley, looking for facilities and technologies he could buy to get Tramel Technology off the ground. In fact, he was one of the many who visited Amiga during this period, although negotiations didn’t get very far. Then one day in June he got a call from a Warner executive, asking if he’d be interested in taking Atari off their hands.

A deal was reached in remarkably little time. Tramiel would buy not the company Atari itself but the assets of its home-computer and game-console divisions; he had no interest in its other branch, standup arcade games. Said assets included property, trademarks and copyrights, equipment, product inventories, and, not least, employees. He would pay, astonishingly, nothing upfront for it all, instead agreeing to $240 million in long-term notes and giving Warner a 32 percent stake in Tramel Technology. Warner literally sold the company — or, perhaps better said, gave away the company — out from under Morgan, who was talking new products and turnaround plans one day and arrived the next to be told to clear out his executive suite to make room for Tramiel. On July 1, just two days after Morse had given back that $500,000, the biggest chunk of Atari, a company which just a couple of years before had been the fastest growing in the history of American business, became the possession of tiny Tramel Technology, which was still being run at the time out of a vacant apartment in a dodgy neighborhood. Within days Tramiel renamed Tramel Technology to Atari Corporation. For years to come there would be two Ataris: Tramiel’s Atari Corporation, maker of home computers and game consoles, and Atari Games, maker of standup arcade games. It would take quite some time to disentangle the two; even the headquarters building would be shared for some time to come.

Legal trouble between Commodore and Jack Tramiel’s new Atari started immediately. Commodore fired the first salvo, suing Shiraz Shivji and his fellow engineers. When they had decamped to join Tramiel, Commodore claimed, they had taken with them a whole raft of technical documents under the guise of “personal goods.” A court injunction issued at Commodore’s request effectively barred them from doing any work at all for Tramiel, paralyzing his plans to start working on a new computer for several weeks. Shivji and company eventually returned a set of backup tapes taken from Commodore engineering’s in-house central server, full of schematics and other documents. Perhaps tellingly in light of the computer they would soon begin to build, many of the documents related to the Commodore 900, a prototyped but never manufactured Unix workstation to be built around the 16-bit Zilog Z8000 CPU.

Sam and Leonard Tramiel, who would play a larger and larger role in the running of Atari as time went on.

If Tramiel was looking for a way to get revenge, he was soon to find what looked like a pretty good opportunity. Whilst going through files of documents in early August, Jack’s son Leonard discovered the Amiga agreement, complete with the cashed $500,000 check from Atari to Amiga, and brought it to his father’s attention. Jack Tramiel, who had long made a practice of treating the courts as merely another field of battle in keeping with his “business is war” philosophy, thought they just might have something. But it wasn’t immediately obvious to whom the cancelled contract should belong: to Atari Games (i.e., the coin-op people), to Warner, or to his own new Atari Corporation. Some hasty negotiating secured him clear title; Warner didn’t seem to know anything about the old agreement or what it might have meant for Atari’s future had it gone off according to plan. On August 13, as Commodore and Amiga were signing the contracts and putting the bow on the Amiga acquisition and as Shivji’s engineers were starting up work again on what was now to be the next-generation Atari computer, Atari filed suit against Amiga and against David Morse personally in Santa Clara Superior Court, alleging contract fraud. In their first motion they sought an injunction, to remain in force while the case was resolved, that would have stopped the work of Commodore’s newly minted Amiga division in its tracks — and for a much longer period of time than Commodore’s more straightforward suit against Shivji and company had threatened.

Thankfully for Commodore, they didn’t get the injunction. However, the legal battle thus sparked would drag on for more than two-and-a-half years. In early 1985 Atari expanded their suit dramatically, adding Commodore, who had of course been footing the legal bill for Amiga and Morse’s defense anyway, as co-defendants — alleging them in effect to have been co-conspirators with Morse and Amiga in the fraud. They also added on a bunch of patent claims, one especially important one relating back to a patent Atari held on the old Atari 400 and 800 designs that Jay Miner had been responsible for in the late 1970s; those designs did indeed share a lot of attributes with the chipset he had developed at Amiga. For this sin Miner personally was added to the suit as yet another co-defendant. The whole thing was finally wrapped up only in March of 1987, in a sealed settlement whose full details have never come to light. Scuttlebutt of then and now, though, would have it that Commodore came out on the losing end, forced to pay Atari’s legal costs and some amount of additional restitution — although, again, exactly how much remains unknown.

What to make of this? A careful analysis of that March 1984 document shows that Morse and Amiga abided entirely by the letter of the agreement, that they were perfectly within their rights to return Atari’s loan to them and walk away from any further business arrangements. Atari’s argument rather lay in the spirit of the deal. At its heart is a single line in the agreement to which Morse signed his name that could easily be overlooked as boilerplate, a throwaway amidst all the carefully constructed legalese: “Amiga and Atari agree to negotiate in good faith regarding the license agreement.”

Atari’s contention, which is difficult to deny, was that Morse had never been acting in good faith. From the moment he put pen to paper, the agreement had been nothing more nor less than a desperate gambit to secure enough operating capital to keep Amiga in business for a few more months and find another suitor. Morse had stalled and obfuscated and dissembled for almost three months whilst he sought that better suitor. Atari alleged that he had even verbally agreed to a “will not sell to” list of companies not allowed to acquire Amiga under any circumstances even as he was negotiating with one of the most prominent entries on that list, Commodore. And when he had forced a check into Farrand’s hands to terminate the relationship, they claimed, he had done so with the shabby excuse that the chips didn’t work properly, even though the whole world had seen them in action just a few weeks before at Summer CES. No, there wasn’t a whole lot of “good faith” going on there.

That said, the ethics of Morse’s actions, or lack thereof, strike me as far from cut and dried. It’s hard for me to get too morally outraged about Morse screwing over a company that was manifestly bent on screwing him in his position of weakness by saddling him with a terrible licensing proposal, an absurd deadline, and legal leverage that effectively destroyed any hope he might have had of getting a reasonable, fair licensing agreement out of them. The letter of intent he felt compelled to sign reads more like an ultimatum than a starting point for negotiations. John Farrand as well as others from Atari claimed in court that, had Morse not delivered that loan repayment in the nick of time, they had had no intention of exercising their legal right to go into escrow and build the Amiga chipset without paying anything more at all for it. Still, these claims must be read skeptically, especially given Atari’s own desperate business position. Certainly Morse would have been an irresponsible executive indeed to base the fate of his company on their word. If Atari had really wished to acquire the chipset and make an equitable, money-making deal for both parties, they could best have achieved that by not opening negotiations with an absurd three-week deadline that put Morse over a barrel from day one.

That, anyway, is my view. Opinions of others who have studied the issue can and do vary. I would merely caution anyone eager to read too much into the fact that Atari by relative consensus won this legal battle in the end to consider the full picture. Even leaving aside the fact that legal right does not always correspond to moral right, we should remember that other issues eventually got bound up into the case. It strikes me particularly that Atari had quite a strong argument to make for Jay Miner having violated their patents, which covered display hardware uncomfortably similar to that in the Amiga chipset, even down to a graphics co-processor very similar in form and function to the so-called “copper” housed inside Agnus. Without knowing more about the contents of the final settlement, I really can’t say more than that.

As the court battle began, the effort to build the computer that would become known as the Atari ST was also heating up. Shivji had initially been enamored with an oddball series of CPUs from National Semiconductor called the NS32000s, the first fully 32-bit CPUs to hit the industry. When they proved less impressive in reality than they were on paper, however, he quickly shifted to the Motorola 68000 that was already found in the Apple Lisa and Macintosh and the Amiga Lorraine. Generally described as a 16-bit chip, the 68000 was in some ways a hybrid 16- and 32-bit design, a fact which gave the new computer its name: “ST” stands for “Sixteen/Thirty-two Bit.” Shivji had had a very good idea even before Tramiel’s acquisition of Atari of just what he wanted to build:

There was going to be a windowing system, it was going to have bitmapped graphics, we knew roughly speaking what the [screen] resolutions were going to be, and so on. All those parameters were decided before the takeover. The idea was an advanced computer, 16/32-bit, good graphics, good sound, MIDI, the whole thing. A fun computer — but with the latest software technology.

Jack Tramiel and his sons descended on Atari and began with brutal efficiency to separate the wheat from the chaff. Huge numbers of employees got the axe from this company that had already been wracked by layoff after layoff over the past year. The latest victims were often announced impersonally by reading from a list of names in a group meeting, sometimes on the basis of impressions culled from an interview lasting all of five minutes. The bottom line was simple: who could help in an all-out effort to build a sophisticated new computer from the ground up in a matter of months? Those judged wanting in the skills and dedication that would be required were gone. Tramiel sold the equipment the departed had left behind — even their desks — to make quick cash to throw into the ST development effort. With Amiga’s computer and who knew what else in the offing from other companies, speed was his priority. He expected his engineers, starting in August with virtually nothing other than Shivji’s rough design parameters, to build him a prototype ready to be demonstrated at the next CES show in January.

Decent graphics capabilities had to be a priority for the type of computer Shivji envisioned. Therefore the hardware engineers spent much of their time on a custom video chip that would support resolutions of up to 640 × 400, albeit only in black and white; the low-resolution mode of 320 × 200 that would be more typically used by games would allow up to 16 colors onscreen at one time from a palette of 512. That chip aside, to save time and money they would use off-the-shelf components as much as possible, such as a three-voice General Instrument sound chip that had already found a home in the popular Apple II Mockingboard sound card as well as various videogame consoles and standup arcade games. The ST’s most unusual feature would prove to be the built-in MIDI interface that let it control a MIDI-enabled synthesizer without the need for additional hardware, a strange luxury indeed for Tramiel to allow, given that he was famous for demanding that his machines contain only the hardware that absolutely had to be there in the name of keeping production costs down. (For a possible clue to why the MIDI interface was allowed, we can look to a typical ST product demonstration. Pitchmen made a habit of somewhat disingenuously playing MIDI music on the ST that was actually produced by a synthesizer under the table. It was easy — intentionally easy, many suspected — for an observer to miss the mention of the MIDI interface and think the ST was generating the music internally.) And of course in the wake of the Macintosh the ST simply had to ship with a mouse and an operating system to support it.
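That 512-color figure, incidentally, falls straight out of the hardware: each ST palette register held three bits apiece of red, green, and blue — nine bits in all, for 512 combinations. Here’s a minimal sketch of unpacking one such color word into modern 24-bit RGB; the function name and the scaling scheme are mine, for illustration only:

```c
/* A sketch of unpacking one ST palette entry. The hardware word is
   packed as 0x0RGB: three bits of red in bits 8-10, green in bits 4-6,
   blue in bits 0-2. So 0x777 is white, 0x700 is pure red, and so on. */

#include <stdint.h>

void st_color_to_rgb(uint16_t st_word, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r3 = (st_word >> 8) & 0x7;   /* 3 bits of red   */
    uint8_t g3 = (st_word >> 4) & 0x7;   /* 3 bits of green */
    uint8_t b3 =  st_word       & 0x7;   /* 3 bits of blue  */

    *r = r3 * 255 / 7;                   /* stretch 0-7 onto 0-255 */
    *g = g3 * 255 / 7;
    *b = b3 * 255 / 7;
}
```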

It was this latter that presented by far the biggest problem. While the fairly conservative hardware of the ST could be put together relatively quickly, writing a modern, GUI-based operating system for the new computer represented a herculean task. Apple, for instance, had spent years on the Macintosh’s operating system, and when the Mac was released it was still riddled with bugs and frustrations. This time around Tramiel wouldn’t be able to just slap an archaic-but-paid-for old PET BASIC ROM into the thing, as he had in the case of the Commodore 64. He needed a real operating system. Quickly. Where to get it?

He found his solution in a very surprising place: at Digital Research, whose CP/M was busily losing its last bits of business-computing market-share to Microsoft’s juggernaut MS-DOS. Digital had adopted an if-you-can’t-beat-em-join-em mentality in response. They were hard at work developing a complete Mac-like window manager that could run on top of MS-DOS or CP/M. It was called GEM, the “Graphical Environment Manager.” GEM was merely one of a whole range of similar shells that were appearing by 1985, struggling with varying degrees of failure to bring that Mac magic to the bland beige world of the IBM clones. Also among them was Microsoft’s original Windows 1.0 — another product that Tramiel briefly considered licensing for the ST. Digital got the nod because they were willing to license both GEM and a CP/M layer to run underneath it fairly cheaply, always music to Jack Tramiel’s ears. The only problem was that it all currently ran only on Intel processors, not the 68000.

The small Atari team that temporarily migrated to Digital Research’s Monterey headquarters to adapt GEM to the ST.

As Shivji and his engineers pieced the hardware together, some dozen of Atari’s top software stars migrated about 70 miles down the California coast from Silicon Valley to the surfer’s paradise of Monterey, home of Digital Research. Working with wire-wrapped prototype hardware that often flaked out for reasons having nothing to do with the software it ran, dealing with the condescension of many on the Digital staff who looked down on their backgrounds as mostly games programmers, wrestling with Digital’s Intel source code that was itself still under development and thus changing constantly, the Atari people managed in a scant few months to port enough of CP/M and GEM to the ST to give Atari something to show on the five prototype machines that Tramiel unveiled at CES in Las Vegas that January. Shivji:

The really exciting thing was that in five months we actually showed the product at CES with real chips, with real PCBs, with real monitors, with real plastic. Five months previous to that there was nothing that existed. You’re talking about tooling for plastic, you’re talking about getting an enormous software task done. And when we went to CES, 85 percent of the machine was done. We had windows, we had all kinds of stuff. People were looking for the VAX that was running all this stuff.

Tramiel was positively gloating at the show, reveling in the new ST and in Atari’s new motto: “Power Without the Price.” Atari erected a series of billboards along the freeway leading from the airport to the Vegas Strip, like the famous Burma-Shave signs of old.

PCjr, $599: IBM, Is This Price Right?

Macintosh, $2195: Does Apple Need This Big A Bite?

Atari Thinks They’re Out Of Sight

Welcome To Atari Country — Regards, Jack

The trade journalists, desperate for a machine to revive the slowing home-computer revolution and with it the various publications they wrote for, ate it up. The ST — or, as the press affectionately dubbed it, the “Jackintosh” — stole the show. “At a glance,” raved Compute! magazine, “it’s hard to tell a GEM screen from a Mac screen” — except for the ST’s color graphics, of course. And one other difference was very clear: an ST with 512 K of memory and monitor would retail for less than $1000 — less than one-third the cost of an equivalent Macintosh.

Rhapsodic press or no, Tramiel’s Atari very nearly went out of business in the months after that CES show. The Atari game consoles as well as the Atari 8-bit line of home computers were all but dead as commercial propositions, killed by the Great Videogame Crash and the Commodore 64 respectively. Thus virtually no money was coming in. You can only keep a multinational corporation in business so long by selling its old office furniture. The software team in Monterey, meanwhile, had to deal with a major crisis when they realized that CP/M just wasn’t going to work properly as the underpinning of GEM on the ST. They ended up porting and completing an abandoned Digital project to create GEMDOS, or, as it would become more popularly known, TOS: the “Tramiel Operating System.” With their software now the last hold-up to getting the ST into production and Tramiel breathing down their necks, the pressure on them was tremendous. Landon Dyer, one of the software team, delivers an anecdote that’s classic Jack Tramiel:

Jack Tramiel called a meeting. We didn’t often meet with him, and it was a big deal. He started by saying, “I hear you are unhappy.” Think of a deep, authoritarian voice, a lot like Darth Vader, and the same attitude, pretty much.

Sorry, Jack, things aren’t going all that hot. We tried to look humble, but we probably just came across as tired.

“I don’t understand why you are unhappy,” he rumbled. “You should be very happy; I am paying your salary. I am the one who is unhappy. The software is late. Why is it so late?”

Young and idealistic, I piped up: “You know, I don’t think we’re in this for the money. I think we just want to ship the best computer we can –”

Jack shut me down. “Then you won’t mind if I cut your salary in half?”

I got the message. He didn’t even have to use the Force.

Somehow they got it done. STs started rolling down production lines in June of 1985. The very first units went on sale not in the United States, where there were some hang-ups acquiring FCC certification, but rather in West Germany. It was just as well, underscoring as it did Tramiel’s oft-repeated vision of the ST as an international computing platform. Indeed, the ST would go on to become a major success in West Germany and elsewhere in Europe, not only as a home computer and gaming platform but also as an affordable small-business computer, a market it would not manage to penetrate to any appreciable degree in its home country. Initial sales on both continents were gratifying, and the press largely continued to gush.

The Atari 520ST, first of a number of computers in the line.

The praise was by no means undeserved. If the ST showed a few rough edges, inevitable products of its rushed development on a shoestring budget, it was more notable for everything it did well. A group of very smart, practical people put it together, ending up with a very sweet little computer for the money. Certainly GEM worked far, far better than a hasty port from a completely different architecture had any right to — arguably better, in fact, than Amiga’s soon-to-be-released homegrown equivalent, the Workbench. The ST really was exactly what Jack Tramiel had claimed it would be: a ridiculous amount of computing power for the price. That made it easier to forgive this “Jackintosh’s” failings in comparison to a real Macintosh, like its squat all-in-one-box case — no Tramiel computer was ever likely to win the sorts of design awards that Apple products routinely scooped up by the fistful even then — and materials and workmanship that didn’t measure up to the Mac’s quite as well as the ST’s raw specs did. The historical legacy of the ST as we remember it today is kind of a tragic one in that it has little to do with the machine’s own considerable merits. The tragedy of the ST is that it was merely a very good machine, whereas its two 68000-based points of habitual comparison, the Apple Macintosh and the Commodore Amiga, together pioneered the very paradigm of computing and, one might even say, of living that we know today.

Speaking of which: just where was Commodore in the midst of all this? That’s a question many in the press were asking. Commodore had made an appearance at that January 1985 CES, but only to show off a new 8-bit computer, the last they would ever make: the Commodore 128. An odd, Frankenstein’s monster hybrid of a computer, it seemed a summary of the last ten years of 8-bit development crammed into one machine, sporting both of the microprocessors that made the PC revolution, the Zilog Z-80 and the MOS 6502 (the latter was slightly modified and re-badged the 8502). Between them they allowed for three independent operating modes: CP/M, a 99.9 percent compatible Commodore 64 mode, and the machine’s unique new 128 mode. This latter addressed most of the 64’s most notable failings, including its lack of an 80-column display, its atrocious BASIC that gave access to none of the machine’s graphics or sound capabilities (the 128’s BASIC 7.0 in contrast was amongst the best 8-bit BASICs ever released), and its absurdly slow disk drives (the 128 transferred data at six or seven times the speed of the 64). Despite being thoroughly overshadowed by the ST in CES show reports, the 128 would go on to considerable commercial success, to the tune of some 4 million units sold over the next four years.

Still, it was obvious to even contemporary observers that the Commodore 128 represented the past, the culmination of the line that had begun back in 1977 with the Commodore PET. What about the future? What about Amiga? While Tramiel and his sons trumpeted their plans for the ST line to anyone who would listen, Commodore was weirdly silent about goings-on inside its new division. The press largely had to make do with rumor and innuendo: Commodore had sent large numbers of prototypes to a number of major software developers, most notably Electronic Arts; the graphics had gotten even better since those CES shows; Commodore was planning a major announcement for tomorrow, next week, next month. The Amiga computer became the computer industry’s unicorn, oft-discussed but seldom glimpsed. This, of course, only increased its mystique. How would it compare to the Jackintosh and the Macintosh? What would it do? How much would it cost? What would it, ultimately, be? And just why the hell was it taking so long? A month after Atari started shipping STs — that machine had gone from a back-of-a-napkin proposal to production in far less time than it had taken Commodore to merely finish their own 68000-based computer — people would at long last start to get some answers.

(Sources: On the Edge by Brian Bagnall; New York Times of July 3 1984, August 21 1984, and August 29 1984; Montreal Gazette of July 12 1984 and July 14 1984; Compute! of August 1984, February 1985, March 1985, April 1985, July 1985, August 1985, and October 1985; STart of Summer 1988; InfoWorld of September 17 1984 and December 17 1984; Wall Street Journal of March 25 1984; Philadelphia Inquirer of April 19 1985. Landon Dyer’s terrific memories of working as part of Atari’s GEM team can be found on his blog as part 1 and part 2. Finally, Marty Goldberg once again shared a lot of insights and information on the legal battle between Atari and Commodore, including some extracts from actual court transcripts, although once again our conclusions about it are quite different. Regardless, my heartfelt thanks to him! Most of the pictures in this article come from STart magazine’s history of the ST, as referenced above.)

 


The 68000 Wars, Part 1: Lorraine

This is what a revolutionary technology looks like. In very early 1986 Tim Jenison, founder of NewTek, began distributing these full-color digitized photographs, the first of their kind ever to be seen on a PC screen, to Amiga public-domain software exchanges. The age of multimedia computing had arrived.

The Amiga was the damnedest computer. A riddle wrapped in a mystery inside an enigma, then all crammed into a plastic case; that was the Amiga. I wrote a book about the thing, and I’m still not sure I can make sense of all of its complications and contradictions.

The Amiga was a great computer when it made its debut in 1985, better by far than anything else on the market. At its heart was the wonderchip of the era, the Motorola 68000, the same CPU found in the Apple Macintosh and the Atari ST. But what made the Amiga special was the stuff found around the 68000: three custom chips with the unforgettable names of Paula, Denise, and Agnus. Together they gave the Amiga the best graphics and sound in the industry by a veritable order of magnitude. And by relieving the 68000 of a huge chunk of the burden for generating graphics and sound as well as performing many other tasks, such as disk access, they let the Amiga dazzle while also running rings around the competition in real-world performance by virtually any test you cared to name. It all added up not just to incremental improvement but rather to that rarest thing in any field of endeavor: a generational leap.
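To give a concrete sense of how that offloading worked: Agnus contained, among much else, a tiny coprocessor called the copper, which executed its own instruction stream in lockstep with the video beam. It knew only two real instructions — WAIT for a beam position and MOVE a value into a chip register — yet that was enough to reprogram the display on the fly, scanline by scanline, without the 68000 lifting a finger. Here’s a minimal sketch of a copper list; expressing it as a C array is my own convenience, and in a real program the list would have to live in chip RAM and be handed to the hardware via the copper’s list-pointer registers:

```c
/* A sketch of a copper list: pairs of 16-bit words. An even register
   offset in the first word means MOVE; an odd first word means WAIT
   for a beam position. This list paints the top of the frame black and
   everything below scanline 100 blue, with zero CPU involvement. */

#include <stdint.h>

#define COLOR00 0x0180           /* background-color register, offset from $DFF000 */

static uint16_t copper_list[] = {
    COLOR00, 0x0000,             /* MOVE: background = black ($RGB format) */
    0x6401,  0xFFFE,             /* WAIT: beam reaches vertical position 100 ($64) */
    COLOR00, 0x000F,             /* MOVE: background = blue for the rest of the frame */
    0xFFFF,  0xFFFE              /* WAIT for an impossible position: end of list */
};
```

Demo coders would eventually push this humble two-instruction machine to absurd lengths, changing the palette on every single scanline to put far more colors onscreen than the display hardware nominally allowed.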

Guru Meditation

The Amiga, especially in its original 1985 incarnation, was a terrible computer. The operating system that shipped with it was painfully buggy. If you could manage to use the machine for just an hour or two without it inexplicably running out of memory and crashing you were doing well. Other glitches were bizarrely entertaining if they didn’t happen to you personally, such as the mysterious “date virus” that could start to spread through all your disks, setting the timestamp on every file to sometime in the year 65,000 and slowing the system to a crawl. (No, this “virus” wasn’t actual malware, just a weird bug.) Of course, software could be and to a large extent eventually was fixed. Other problems were more intractable. There was, for instance, the machine’s use of interlaced video for its higher-resolution modes: because an interlaced display draws the odd and even scanlines on alternating video fields, each line gets refreshed only half as often, causing those marvelous graphics to flicker horribly in most color combinations. Baffled users who felt like their swollen eyeballs were about to pop right out of their heads after a few hours of trying to work like this could expect to be greeted with a lot of technical explanations of why it was happening and suggestions for changing their onscreen color palettes to try to minimize it. Certainly anyone who picked up an Amiga expecting an experience similar to the famously easy-to-use Macintosh was in for a disappointment. Despite the Amiga’s sporting a superficially similar mouse-and-windows interface, users hoping to get serious work or play done on the Amiga would need to educate themselves on such technical minutiae as the difference between “chip” and “fast” memory and learn what a program’s “stack” was and how to set it manually. Even on a good day the Amiga always felt like a house of cards ready to be blown over by the first breath of wind. When the breeze came, the user was left staring at an inscrutable “Guru Meditation Error” and a bunch of intimidating numbers. Sometimes the Amiga could seem positively designed to confound.
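That chip-versus-fast distinction, for what it’s worth, was no mere pedantry: only “chip” RAM was visible to Paula, Denise, and Agnus, so anything they were to display or play had to be allocated there explicitly. A minimal sketch using Exec’s actual AllocMem() and FreeMem() calls follows — the buffer size and function name are illustrative assumptions of mine, not drawn from any particular program:

```c
/* A sketch of the chip-versus-fast memory distinction, using Exec's
   real AllocMem()/FreeMem() calls as compiled by a period Amiga C
   compiler. The bitplane scenario here is an illustrative assumption. */

#include <exec/types.h>
#include <exec/memory.h>
#include <proto/exec.h>

#define BITPLANE_SIZE (320 * 200 / 8)   /* one 320 x 200 bitplane, in bytes */

int show_something(void)
{
    /* Graphics data must live in chip RAM, the only memory Denise and
       Agnus can actually see; MEMF_FAST memory is invisible to them. */
    UBYTE *plane = AllocMem(BITPLANE_SIZE, MEMF_CHIP | MEMF_CLEAR);
    if (plane == NULL)
        return 1;   /* out of chip memory -- a familiar Amiga frustration */

    /* ... point the display hardware at the bitplane, draw into it ... */

    FreeMem(plane, BITPLANE_SIZE);      /* Exec doesn't track sizes for you */
    return 0;
}
```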

The Amiga anticipated the future, marked the beginning of a new era. It pointed forward to the way we live and compute today. I titled my book on the machine The Future Was Here for a reason. That aforementioned generational leap in graphics and sound was the most significant in the history of the personal computer in that it made the Amiga not just a new computer but something qualitatively new under the sun: the world’s first multimedia PC. With an Amiga you could for the first time store and play back in an aesthetically pleasing way imagery and sound captured from the real world, and combine and manipulate and interact with it within the digital environment inside the computer. This changed everything about the way we compute, the way we play, and eventually the way we live, making possible everything from the World Wide Web to the iPod, iPad, and iPhone. Almost as significantly, the Amiga pioneered multitasking on a PC, another feature enabled largely by that magnificent hardware that was able to stretch the 68000 so much farther than other computers. There is considerable psychological research today that indicates that, for better or for worse, multitasking has literally changed the way we think, changed our brains — not a bad claim to fame for any commercial gadget. When you listen to music whilst Skyping on-and-off with a friend whilst trying to get that term paper finished whilst looking for a new pair of shoes on Amazon, you are what the Amiga wrought.

The Amiga was stuck in the past way of doing things, thus marking the end of an era as well as the beginning of one. It was the punctuation mark at the end of the wild-and-woolly first decade of the American PC, the last time an American company would dare to release a brand new machine that was completely incompatible with what had come before. Its hardware design reflected the past as much as the future. Those custom chips, coupled together and to the 68000 so tightly that not a cycle was wasted, were a beautiful piece of high-wire engineering created by a bare handful of brilliant individuals. If a computer can be a work of art, the Amiga certainly qualified. Yet its design was also an evolutionary dead end; the custom chips and all the rest were all but impossible to pull apart and improve without breaking all of the software that had come before. The future would lie with modular, expandable design frameworks like those employed by the IBM PC and its clones, open hardware (and software) standards that were nowhere near as sexy or as elegant but that could grow and improve with time.

The Amiga was a great success, the last such before the Wintel hegemony expanded to dominate home computing like it already did business by the mid-1980s. Its gaming legacy is amongst the richest of any platform ever, including some fifteen years’ worth of titles that, especially during the first half of that period, broke boundaries at every turn and expanded the very notion of what a computer game could be. I won’t even begin to list here the groundbreaking classics that were born on the Amiga; suffice to say that they’ll be featuring in this blog for years to come. The Amiga was so popular a gaming platform in Europe that it survived many years after the death of its corporate parent Commodore, a phenomenon unprecedented in consumer computing. The last of the many glossy newsstand magazines devoted to it, Britain’s Amiga Active, didn’t cease publication until November of 2001, well over seven years after the platform became an orphan. It would prove to be just as long-lived in its other major niche of video-production workstation. Thanks to their unique ability to blend their own visuals with analog video signals — enabled, ironically, by those very same interlaced video modes that drove so many users crazy — Amigas could be found in the back rooms of small cable stations and video producers into the 2000s. Only the great changeover to digital HD broadcasting finally and definitively put an end to the Amiga’s career in this realm.

The Amiga was a bitter failure, one of the great might-have-beens of computer history. In 1985 so many expected it to become so much more than just another game machine or even “just” the pioneer of the whole new field of desktop video, forerunner of the YouTube generation. The Amiga, believed its early adopters, was so much better — not just technically better but conceptually better — than what was already out there that it was surely destined to conquer the world. After all, business-software heavy hitters like WordPerfect, Borland, Ashton-Tate, and Lotus knew a good thing when they saw it, were already porting their applications to it. And yet in the end only WordPerfect came through, for a while, and, while the Amiga did change the world in the long term, its innovations were refined and carried into everyday life by Apple and Microsoft rather than by the Amiga itself. The vast majority of heirs to the Amiga’s legacy today — a number which includes virtually every citizen of the developed world — have no idea a computer called the Amiga ever existed.

That’s just a sample of the contradictions awaiting any writer who tries to seriously tackle the Amiga as a subject. And there’s also another, more ironic sort of difficulty to be confronted: the sheer love the Amiga generated on the part of so many who had one. The Amiga, I must confess, was my own first computing love. Since that day in 1994 when I gave in and bought my first Wintel machine, I’ve been platform-agnostic. Linux and Apple zealots and Microsoft apologists all leave me cold, leave me wondering how people can get so passionate about any platform not called Amiga. Of course I’m smart enough to realize that none of this is really all that important, that a gadget is just that, a means to an end. I even recognize that, had the Amiga not come along when it did to pioneer a new paradigm for computing, something else would have. That’s just how history works. But still, there was something special about the Amiga for those of us who were there, something going far beyond even a hacker’s typical love for his first computer.

To say Amiga users had — still have — a reputation for zealotry hardly begins to state the case. General-computing magazines from the late 1980s until well into the 1990s learned to expect a deluge of hate mail from Amiga users every time they published an article that dared say an unfavorable word about the platform — or, worse, and as inevitably happened more and more frequently as time went on and the Amiga faded further from prominence, that didn’t mention it at all. Prominent mainstream columnist John C. Dvorak liked to say that, whereas Mac users were just arrogant and self-righteous, Amiga users were actively delusional. There are still folks out there clinging to their 25-year-old Amigas, patched together with the proverbial duct tape and baling wire, as their primary computing platform. A disturbing number of them are still waiting for the day when the Amiga shall rise again and take over the world, even as it’s hard to understand what a modern Amiga should even be or why it should exist in a world that long since incorporated all of the platform’s best ideas into slicker, simpler gadgets.

Every good cult needs an origin myth, and the Cult of Amiga is no exception. Beginning already in the machine’s North American heyday of the late 1980s, High Priest R.J. Mical, developer of the Amiga’s Intuition library of GUI widgets as well as other critical pieces of its software infrastructure, began traveling to trade shows and conventions telling in an unabashedly sentimental way the story of those earliest days, when the Amiga was being developed by a tiny independent company, itself called simply Amiga, Incorporated.

We were trying to find people that had fire, that had spirit, that had a dream they were trying to accomplish. Carl Sassenrath, the guy that did the Exec for the machine, it was his lifelong dream to do a multitasking operating system that would be a work of art, that would be a thing of beauty. Dale Luck, the guy that did the graphics, this was his undying dream since he was in college to do this incredible graphics stuff.

We were looking for people with that kind of passion, that kind of spirit. More than anything else, the thing that we were looking for was people who were trying to make a mark on the world, not just in the industry but on the world in general. We were looking for people that really wanted to make a statement, that really wanted to do an incredibly great thing, not just someone who was looking for a job.

Yes. Well. While idealism certainly has its place in the Amiga story, it’s also a very down-to-earth tale of competition inside Silicon Valley. It begins in 1982 with an old friend of ours: Larry Kaplan, one of the Fantastic Four game programmers from Atari who founded Activision along with Jim Levy.

Activision was flying high in 1982, the Fantastic Four provided, in Kaplan’s own words, with “limousine service, company cars, and a private chef” on top of a base salary of $150,000. Yet Kaplan, often described by others as the very apotheosis of “the grass is always greener,” was restless. He had the idea to form another company, one all his own this time, to enter the booming Atari VCS market. One day in early 1982 he called up an old colleague from the Atari days: Jay Miner, who had designed the Atari VCS’s display chip, then gone on to design the chipset at the heart of the Atari 400 and 800 home computers. Kaplan, along with two others of the Fantastic Four, had written the operating system and BASIC language implementation for those machines. He thus knew Miner well. Knowing the vagaries of business and of starting his own company rather less well than he knew Miner and programming, he opened with a simple query: “I’d like to start a company. Do you know any lawyers?”

Miner, who had left Atari at around the same time as the Fantastic Four out of a similar disgust with new CEO Ray Kassar, had also left Silicon Valley to move to Freeport, Texas, where he worked for a small semiconductor company called Zymos, designing chips for pacemakers and other medical devices. Miner said that, no, he wasn’t particularly well-acquainted with any lawyers, good or otherwise, but that his boss, Zymos founder Bert Braddock, had a pretty good head for business. He made the introduction, and Kaplan and Braddock hit it off. The plan that Kaplan presented to him was to combine hardware and software in the booming home videogame space, offering hardware to improve on the Atari VCS’s decidedly limited capabilities along with game cartridges that took advantage of the additional gadgetry. Such a scheme was hardly original to him; confronted with the VCS’s enormous popularity and equally enormous limitations, others were already working the same space. For example, two other former Atari engineers, Bob Brown and Craig Nelson, had already formed Starpath to develop a “Supercharger” hardware expansion for the VCS as well as games to play with it. (Starpath would go on to merge with the newly renamed Epyx — née Automated Simulations — and write games like Summer Games.)

Nevertheless, Braddock sensed a potentially fruitful partnership in the offing for a maker of chips like his Zymos. He found Kaplan some investors in nearby oil-rich Houston to put up the first $1 million or so to get the company off the ground. He also found and recruited one Dave Morse, a vice president of marketing at Tonka Toys, to join Kaplan, believing him to be exactly the savvy business mind and shrewd negotiator the venture needed. An informal agreement was reached amongst the group: Morse would run the new company; Kaplan would write the games; Miner (working under contract, being still employed by Zymos) would design the ancillary hardware; and Zymos would manufacture the hardware and the game cartridges. Somewhere at the back of everyone’s mind was the idea that, if they were successful with their games and add-on gadgets, they might just be able to take the next step: to make a complete original game console of their own, the successor to the Atari VCS that Ray Kassar’s Atari didn’t seem all that interested in seriously pursuing.

In June of 1982, Kaplan announced to his shocked colleagues at Activision that he was moving on to do his own thing; the bridges he thus burnt were never mended. He and Morse opened a small office in Santa Clara, California, for their new company, which Kaplan named Hi-Toro. Morse and Braddock — the latter truly the sort of sugar daddy a fledgling corporation dreams of — beat the bushes over the months that followed for additional financing, with success to the tune of another $5 million or so. The majority of the new investors were dentists and other members of the medical establishment, thanks to Braddock’s connections in that field. They knew little to nothing about computer technology, but knew very well that videogames were hot, and were eager to get in on the ground floor of another Atari.

And then the squirrelly Larry Kaplan nearly undid the whole thing. He called Atari founder Nolan Bushnell that October to talk up his new company, hoping to convince him to join Hi-Toro as chairman of the board; a name like his would confer instant legitimacy. Instead the hunter became the hunted. Bushnell, who was legendary for the buckets of charm at his fingertips, convinced Kaplan to come to him instead, promising that together they could start a new videogame company to rival Atari — without Zymos or Morse or Miner. Just like that, Kaplan tendered his second shocking resignation of 1982. In the end, as Kaplan later put it, “Nolan, of course, flaked out,” leaving him high and dry, if quite possibly deservedly so. He would complete the circle by going back to Atari before the year was up, but that gig ended when the Great Videogame Crash of 1983 hit. By that point Kaplan was widely regarded inside the industry as too untrustworthy to be worth the trouble, and his career never recovered. On the plus side, he was able to cash out his Activision stock following that company’s IPO, making him quite a wealthy man and making future work largely optional anyway — not the worst of petards for a modern-day Claudius.

Dave Morse, meanwhile, was also left high and dry, with a company and an office and lots of financing but nobody to design his products. He asked Jay Miner to leave Zymos and join him full-time at Hi-Toro, to help fill the vacuum left by Kaplan’s departure. Miner, who had for some time been nursing a dream of doing a game console and/or a computer based around the new Motorola 68000, and who saw Hi-Toro as just possibly his one and only chance to do that, agreed — so long as he could bring his beloved cockapoo Mitchy with him to the office every day.

One of the first things to go after Kaplan left was the company name he had come up with. Everyone Morse and Miner spoke to agreed that “Hi-Toro” was a terrible name that made one think of nothing so much as lawn mowers. Morse therefore started flipping through a dictionary one day, looking for something that would come before Apple and Atari in corporate directories. He hit upon the Spanish word for “friend”: “amigo.” That had a nice ring to it, especially with “user-friendliness” being one of the buzzwords of the era. But the feminine form of the word — “amiga” — sounded even better: friendly and elegant, maybe even a little bit sexy. Miner by his own later admission was ambivalent about the new name, but everyone Morse spoke to seemed very taken with it, so he let it go. Thus did Hi-Toro become Amiga.

Of course, Morse and Miner couldn’t do all the work by themselves. Over the months that followed they assembled a team whose names would go down in hacker lore. An old colleague from Atari who had worked with Miner on the VCS as well as the 400 and 800, Joe Decuir, came in under a temporary contract to help Miner start work on a new set of custom chips. A few other young hardware engineers were hired as full-time employees. Morse hired one Bob Pariseau to put together a software team; Pariseau became essentially the equivalent of Jay Miner on that side of the house. The software people would soon grow to outnumber the hardware people. Among their ranks were such now-legendary Amiga names as R.J. Mical, Dale Luck, and Carl Sassenrath.

The folks who came to work at Amiga were almost universally young and largely inexperienced. While tarring them with the clichéd “dreamers and misfits” label may be going too far, it is true that their backgrounds were more diverse than the Silicon Valley norm; Mical, for instance, was a failed English major who had recently spent nine months backpacking his way around the world. While their youthful idealism would do much to give the eventual Amiga computer its character, there was also a very practical reason for Morse to fill his office with all these bright young sparks: with financing getting harder and harder to come by as the videogame industry began to go distinctly soft, he simply couldn’t afford more experienced hands. Amiga’s financial difficulties provided the opportunity of a lifetime to a bunch of folks who might have struggled to get in the door in even the most junior of positions at someplace like Apple, IBM, or Microsoft.

The glaring exception to the demographic rule at Amiga was Jay Miner himself. Creative, bleeding-edge engineering is normally a young person’s game. Miner, however, was fully 50 years old when he created his masterpiece, the Amiga chipset. He was already designing circuits twenty years before the microprocessor existed, and well before some of his colleagues around the office were even born. Thanks perhaps to intermittent but chronic kidney problems that would eventually kill him at age 62, he looked and in some ways acted even older than his years, favoring quiet, contemplative hobbies like cultivating bonsai trees and carving model airplanes out of balsa wood. Adjectives like “fatherly” rival “soft-spoken” and “wise” in popularity when people who knew him remember him today. While the higher-strung Dave Morse became the face Amiga showed to the outside world, Miner set the internal tone, tolerating and even encouraging the cheerful insanity that was life inside the Amiga offices. Miner:

The great things about working on the Amiga? Number one I was allowed to take my dog to work, and that set the tone for the whole atmosphere of the place. It was more than just companionship with Mitchy — the fact that she was there meant that the other people wouldn’t be too critical of some of those we hired, who were quite frankly weird. There were guys coming to work in purple tights and pink bunny slippers. Dale Luck looked like your average off-the-street homeless hippy with long hair and was pretty laid-back. In fact the whole group was pretty laid-back. I wasn’t about to say anything — I knew talent when I saw it and even Pariseau who spread the word was a bit weird in a lot of ways. The job gets done and that’s all that matters. I didn’t care how solutions came about even if people were working at home.

The question of just what this group was working on, and when, is harder to answer than you might expect. When we use the word “Amiga” to refer to this era, we could be talking about any of three things. Firstly, there’s Amiga the company, which during its early months put well over half of its personnel and resources into games and add-ons for the old Atari VCS rather than revolutionary new technology. Then there’s the Amiga chipset being designed by Miner and his team. And finally there’s a completed game console and/or computer to incorporate that chipset. Making sense of this tangle is complicated by revisionist retellings, which tend to find grand plans and coherent narratives where none actually existed. So, let’s take a careful look at each of these Amigas, one at a time.

The Amiga Joyboard

Kaplan’s original plan had envisioned Hi-Toro/Amiga as a maker first and foremost of cartridges and hardware add-ons for the VCS, with a whole new console possibly to follow if things went gangbusters. These plans got reprioritized somewhat when Kaplan left and Miner came aboard with his eagerness to do a console and/or computer, but they were by no means entirely discarded. Thus Amiga did indeed create a handful of original games over the course of 1983, along with joysticks and other hardware. By far the most innovative and best-remembered of these products was something called the Joyboard: a large, flat slab of plastic on which the player stood and leaned side to side and front to back to control a game in lieu of a joystick. Amiga packaged a skiing game, Mogul Maniac, with the Joyboard, and developed at least two more — a surfing game called Surf’s Up and a pattern-matching exercise called Off Your Rocker — that never saw release. The Joyboard and its companion products have been frequently characterized as little more than elaborate ruses designed to keep the real Amiga project under wraps. In reality, though, Morse had high commercial hopes for this side of his company; he was in fact depending on these products to fund the other side of the operation. He spent quite lavishly to give the Joyboard a splashy introduction at the New York Toy Fair in February of 1983, and briefly hired former Olympic skier Suzy Chaffee — better known to a generation of Americans as “Suzy Chapstick” thanks to her long-running endorsement of that brand — to serve as spokesperson. His plans were undone by the Great Videogame Crash. The peripherals and games all failed miserably, precipitating a financial crisis at Amiga to which I’ll return shortly.

The chips were always Jay Miner’s babies. Known in the early days as Portia, Daphne, and Agnus, the chips would in later iterations see Portia renamed to Paula and Daphne to Denise. Combined with a 68000, they offered unprecedented audiovisual capabilities, including a palette of 4096 colors and four-channel stereo sound. Their most innovative features were the so-called “copper” and “blitter” housed inside Agnus. The former, a less advanced version of which could be found in Miner’s earlier Atari 400 and 800, could run short programs independent of the CPU, changing the display setup on the fly as the perpetually repainting electron gun behind the television or monitor reached certain points in its cycle. This opened the door to a whole universe of visual trickery. The blitter, meanwhile, could be programmed to copy blocks of memory from place to place at lightning speed, performing transformations and combinations on the data as it went — once again, independent of the CPU. It was a miracle worker in the realm of fast animation. While not programmable in the same sense as the copper and the blitter, Denise autonomously handled the task of actually painting the display, while Paula could play back up to four sound samples or waveforms at a time and independently handle input and output to disk. (This is the briefest of technical summaries of the Amiga chipset. For a detailed description of its internal workings, as well as many important aspects of its host platform’s history that I’ll never get to in this game-focused blog, I point you again to my own book on the subject.)
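To make the copper a little more concrete, here is a minimal sketch, in C, of a copper list in the bare-metal style of early Amiga programming. The instruction encodings and the COLOR00 register offset follow the published Amiga hardware documentation, but the particular colors and scanline are my own illustrative choices, and the setup steps (pointing the hardware at the list and enabling copper DMA) are omitted.

/* A copper list is just an array of 16-bit word pairs that the copper
   fetches and executes on its own, one pair per instruction. A MOVE
   puts a custom-chip register offset (an even number) in its first
   word and the value to write in its second; a WAIT puts a beam
   position in its first word (with the low bit set) and a compare
   mask in its second. */

#include <stdint.h>

#define COLOR00 0x0180  /* offset of the background-color register */

static uint16_t copper_list[] = {
    COLOR00, 0x000F,  /* MOVE: set the background color to blue */
    0x640F,  0xFFFE,  /* WAIT: stall until the beam reaches scanline 100 */
    COLOR00, 0x0F00,  /* MOVE: then switch the background to red */
    0xFFFF,  0xFFFE,  /* WAIT for an impossible position: end of list */
};

Re-executed automatically at the top of every frame, a list like this would paint the screen blue above scanline 100 and red below it without the CPU lifting a finger. A longer list could change the palette on every single scanline, which is how Amiga software conjured far more onscreen colors than its 32 palette registers would seem to allow.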

Amiga’s ultimate vision for their chipset — whether in the form of a game console, a computer, a standup arcade game, or all three — is the most difficult part of this tangled skein of intentions to unravel, and the one most subject to revisionist history. Amiga fanatics of later years, desperate to have their platform accepted as a “serious” computer like the IBM PC or Apple Macintosh, became rather ashamed of its origins in the videogame industry. This has occasionally led them to claim that the Amiga was always secretly intended to be a computer, that the videogame plans were just there to fool the investors and keep the money flowing. In truth, there’s good reason to question whether there was any real long-term plan at all. Miner noted in later interviews that the company was quite split on the subject, with — ironically in light of his later status as Amiga High Priest — R.J. Mical on the “investors’ side,” pushing for a low-cost game console, while others like Dale Luck and Carl Sassenrath wanted an Amiga computer. Miner himself claimed to have envisioned a console that could be expanded into a real computer with the addition of an optional keyboard and disk drive. (Amiga also had similar plans for the Atari VCS in the form of something to be called the Amiga Power Module, yet another project killed by the videogame collapse.) Dave Morse, who died in 2007, is not on record on the subject at all. One suspects that he was simply in wait-and-see mode through much of 1983.

What is clear is that the first Amiga machine to be shown to the public wasn’t so much a prototype of a real or potential computer or game console as the most minimalist possible frame to show off the capabilities of the Amiga chipset. Named after Morse’s wife, the Amiga Lorraine began to come together in the dying days of 1983, in a mad scramble leading up to the Winter Consumer Electronics Show that was scheduled to begin on January 4. Any mad scientist would have been proud to lay claim to the contraption. Miner and his team built their chipset, destined eventually to be miniaturized and etched into silicon, out of off-the-shelf electronics components, creating a pile of breadboards large enough to fill a kitchen table, linked together by a spaghetti-like tangle of wires, often precariously held in place with simple alligator clips. It had no keyboard or other input method; the software team wrote programs for it on a workstation-class 68000-based computer called the Sage IV, then uploaded them to the Lorraine and ran them via a cabled connection. The whole mess was a nightmare to maintain, with wires constantly falling off, pieces overheating, or circuits shorting out seemingly at random. But when it worked it provided the first tangible demonstration of Miner’s extraordinary design. Amiga accordingly packed it all up and transported it — very carefully! — to Las Vegas for its coming-out party at Winter CES.

R.J. Mical and Dale Luck, amongst others, had worked feverishly to create a handful of demos to show off in a private corner of Amiga’s CES booth, open only by invitation to hand-selected members of the press and industry. The hit of the bunch, written by Mical and Luck at the show itself in one marathon all-night hacking session fueled by “a six pack of warm beer,” was a huge, checkered soccer ball that bounced up and down, prototype of one of the most famous computerized demos of all time. The bouncing soccer ball — the “boing” ball — would soon become the unofficial symbol of Amiga.


Boing and the other demos were impressive, but the hardware was obviously still in a very rough state, still a long, long way from any sort of salable product. Many observers were frankly skeptical that this mass of breadboards and wires could ever be reduced to the three chips Amiga promised, and, even if it could, that chips so complicated could be manufactured cost-effectively. The two most obvious applications of the chipset, a new videogame console or standup arcade games, were facing a gale-force headwind following the Great Videogame Crash of the previous year; nobody wanted anything to do with that market anymore. And introducing yet another incompatible computer into the market, no matter how impressive its hardware, looked like a high-risk proposition as well. Thus most visitors were impressed but carefully noncommittal. Was there really a place for Amiga’s admittedly extraordinary technology? That was the question. Tellingly, of the glossy magazines only Creative Computing bothered to write about Lorraine in any real detail, excitedly declaring it to have “the most amazing graphics and sound that will ever have been offered in the consumer market.” (Just to show that prescience isn’t always an either/or proposition, the same journalist, John J. Anderson, noted how important it would be to make sure any eventual Amiga computer was compatible with the IBM PCjr, which was sure to take over the industry.)

Amiga’s coming-out party is thus best characterized as a mixed success: plenty of impressed observers, but no new investors. And that was a big, big problem, because Amiga was quickly running out of money. With the VCS products having not only failed to sell but absorbed millions in development costs in their own right, Amiga’s financial picture was getting more desperate by the week. One thing was becoming clear: there was no way they were going to be able to secure the investment needed to turn the Lorraine into a completed computer — or a completed anything else — and market it themselves. That left three options: license the technology to someone with deeper pockets, sell themselves outright, or go quietly out of business. As the founders mortgaged their houses to make payroll and Morse begged his creditors for loan extensions, the only company that seemed seriously interested in the Amiga chipset was the one Jay Miner would least prefer to get in bed with once again: Atari.

An Atari old-timer named Mike Albaugh had first visited Amiga well before the CES show, in November of 1983. He was given an overview of the features of a chipset that as yet existed only on paper and, knowing very well the capabilities of Jay Miner, expressed cautious interest. After their first tangible glimpse of the chipset in action at CES, Atari got serious about acquiring this incredible technology from a company that seemed all but at their mercy, desperate for any deal that would let them stay alive a little longer. With no other realistic options on the table, Dave Morse negotiated with Atari as best he could from his position of weakness. Atari had no interest in buying a completed machine, whether of the game-console or computer variety; they just wanted that wonderful chipset. The preliminary letter of intent that Amiga and Atari signed on March 7, 1984, reflects this.

That same letter of intent, and the $500,000 that Atari transferred to Amiga as part of it, would lead to a legal imbroglio lasting years. The specifics that the letter contained, as well as — equally importantly — what it did not contain, remain persistently misunderstood to this day. Thankfully, the original agreement has been preserved and made available online by Atari historians Marty Goldberg and Curt Vendel. I’ve taken the time to parse this document closely, and also enlisted the aid of a couple of acquaintances with better legal and financial minds than my own. Because it’s so critical to the story of Amiga, and because it’s been so widely misunderstood and misconstrued, I think it’s worth taking a moment here to look fairly closely at its specifics.

The document outlines a proposed arrangement granting Atari an exclusive license to the chipset for use in home videogame consoles and standup arcade games, in perpetuity from the time that the finalized agreement is signed. The proposal also grants Atari a nonexclusive license to use the chips in a personal computer, subject to timing restrictions: Atari may offer an add-on kit to turn a game console using the chips into a full-blown computer no earlier than June of 1985, and a standalone computer using the chips no earlier than March of 1986. Both before and after Atari makes their computer using the chips, Amiga may make one of their own, but may sell it only through specialized computer dealers, not mass merchandisers like Sears or Toys ‘R’ Us; Atari, conversely, will be restricted to the mass merchandisers. The obvious intention here is to target Amiga’s products to the high-end, professional market and Atari’s to gamers and casual users. Atari will pay Amiga a royalty of $2 per computer or game console containing the chipset sold, and $15 per standup arcade videogame. Note that the terms I’ve just described are only a proposal pending a finalized license agreement, without legal force — unless certain things happen that automatically trigger their going into effect, which I’ll get to momentarily.

Now let’s look at the parts of the document that do have immediate legal force. Amiga being starved for cash and still needing to do considerable work to complete the chipset, Atari will give Amiga an immediate “loan” of $500,000, albeit one which they never really expect to see paid back; again, I’ll explain why momentarily. Atari will then continue to give Amiga more loans on a milestone basis: $1 million when a finalized licensing agreement is signed; $500,000 when each of the three chips is completed and delivered to Atari ready for manufacturing. And here’s where things get tricky: once all of the chips are delivered and a licensing agreement is in place, Amiga’s outstanding loan obligations will be converted into a purchase by Atari of $3 million worth of Amiga stock. If, on the other hand, a finalized licensing agreement has not been signed by March 31 — just three weeks from the date of this preliminary agreement — Amiga will be expected to pay back the $500,000 to Atari by June 30, plus interest at 120 percent of the current Bank of America prime rate, assuming some other deal is not negotiated in the interim. If Amiga cannot or will not do so, the proposed licensing agreement outlined above will automatically go into effect as a legally binding contract, with the one very significant change that Atari will not need to pay any royalties at all — the license “shall be fully paid in exchange for cancellation of the loan.” The Amiga chipset thus serves as collateral for the loan, its blueprints and technical specifications held in escrow by a neutral third party (the Bank of America).

There are plenty of other technicalities — for instance, Atari will be allowed to bill Amiga for their time and other resources if Amiga fails to complete the chipset, thus forcing Atari’s engineers to finish the job — but I believe I’ve covered the salient points here. (Those deeply interested in, or skeptical of, my conclusions may want to look at a more detailed summary I prepared, or, best of all, just have a look at the original.) Looking at the contract, what jumps out first is that it wasn’t a particularly good deal for Amiga. To pay a mere $2 per console or computer sold, when the chipset being paid for is the very component that makes that console or computer what it is, seems shabby indeed. For Atari it would have represented the steal of the century. Why would Morse sign such an awful deal?

The obvious answer must of course be that he was desperate. While it’s perhaps dangerous to ascribe motives to a dead man who never publicly commented on the subject, circumstantial evidence would seem to characterize this agreement as the wind-up to a final Hail Mary: a way to secure a quick $500,000 for the here and now, to keep the lights on a little longer and hope for a miracle. Morse did not sign a final licensing agreement by March 31, a very risky move indeed, as it gave Atari the right to start using Amiga’s chipset automatically, without having to pay Amiga another cent, if Morse couldn’t negotiate some other arrangement with them or find some way to pay back the $500,000 plus interest before June 30. Carl Sassenrath once described Morse as “my model for how to be cool in business.” Truly he must have had nerves of steel. And, incredibly, he would get his miracle.

(Sources: On the Edge by Brian Bagnall. Amiga User International of June 1988 and March 1993. Info of January/February 1987 and July/August 1988. Creative Computing of April 1984. Amazing Computing, premiere issue. InfoWorld of July 12 1982. Commander of August 1983. Scott Stilphen’s interview with Larry Kaplan on the 2600 Connection website. Thanks also to Marty Goldberg for patiently corresponding with me and giving me Atari’s perspective, although I believe his conclusions about the Amiga/Atari negotiations and particularly his reading of the March 7 1984 agreement to be in error. And yeah, there’s my own book too…)
