
Lemmings 2: The Tribes

When the lads at DMA Design started making the original Lemmings, they envisioned that it would allow you to bestow about twenty different “skills” upon your charges. But as they continued working on the game, they threw more and more of the skills out, both to make the programming task simpler and to make the final product more playable. They finally ended up with just eight skills, the perfect number to neatly line up as buttons along the bottom of the screen. In the process of this ruthless culling, Lemmings became a classic study in doing more with less in game design: those eight skills, combined in all sorts of unexpected ways, were enough to take the player through 120 ever-more-challenging levels in the first Lemmings, then 100 more in the admittedly less satisfying pseudo-sequel/expansion pack Oh No! More Lemmings.

Yet when the time came to make the first full-fledged sequel, DMA resurrected some of their discarded skills. And then they added many, many more of them: Lemmings 2: The Tribes wound up with no less than 52 skills in all. For this reason not least, it’s often given short shrift by critics, who compare its baggy maximalism unfavorably with the first game’s elegant minimalism. To my mind, though, Lemmings 2 is almost a Platonic ideal of a sequel, building upon the genius of the original game in a way that’s truly challenging and gratifying to veterans. Granted, it isn’t the place you should start; by all means, begin with the classic original. When you’ve made it through those 120 levels, however, you’ll find 120 more here that are just as perplexing, frustrating, and delightful — and with even more variety to boot, courtesy of all those new skills.



The DMA Design that made Lemmings 2 was a changed entity in some ways. The company had grown in the wake of the first game’s enormous worldwide success, such that they had been forced to move out of their cozy digs above a baby store in the modest downtown of Dundee, Scotland, and into a more anonymous office in a business park on the outskirts of town. The core group that had created the first Lemmings — designer, programmer, and DMA founder David Jones; artists and level designers Mike Dailly and Gary Timmons; programmer and level designer Russell Kay — all remained on the job, but they were now joined by an additional troupe of talented newcomers.

Lemmings 2 also reflects changing times inside the games industry in ways that go beyond the size of its development team. Instead of 120 unrelated levels, there’s now a modicum of story holding things together. A lengthy introductory movie — which, in another telling sign of the times, fills more disk space than the game itself and required almost as many people to make — tells how the lemmings were separated into twelve tribes, all isolated from one another, at some point in the distant past. Now, the island (continent?) on which they live is facing an encroaching Darkness which will end all life there. Your task is to reunite the tribes, by guiding each of them through ten levels to reach the center of the island. Once all of the tribes have gathered there, they can reassemble a magical talisman, of which each tribe conveniently has one piece, and use it to summon a flying ark that will whisk them all to safety.

It’s not exactly an air-tight plot, but no matter; you’ll forget about it anyway as soon as the actual game begins. What’s really important are the other advantages of having twelve discrete progressions of ten levels instead of a single linear progression of 120. You can, you see, jump around among all these tribes at will. As David Jones said at the time of the game’s release, “We want to get away from ‘you complete a level or you don’t.'” When you get frustrated banging your head against a single stubborn level — and, this being a Lemmings game, you will get frustrated — you can just go work on another one for a while.

Rather than relying largely on the same set of graphics over the course of its levels, as the original does, each tribe in Lemmings 2 has its own audiovisual theme: there are beach-bum lemmings, Medieval lemmings, spooky lemmings, circus lemmings, alpine lemmings, astronaut lemmings, etc. In a tribute to the place where the game was born, there are even Scottish Highland lemmings (although Dundee is actually found in the less culturally distinctive — or culturally clichéd — Lowlands). And there’s also a “classic” tribe that reuses the original graphics; pulling it up feels a bit like coming home from an around-the-world tour.


Teaching Old Lemmings New Tricks

In this Beach level, a lemming uses the “kayak” skill to cross a body of water.

In this Medieval level, one lemming has become an “attractor”: a minstrel who entrances all the lemmings around him with his music, keeping them from marching onward. Meanwhile one of his colleagues is blazing a trail in front for the rest to eventually follow.

In this Shadow level, the lemming in front has become a “fencer.” This allows him to dig out a path in front of himself at a slight upward angle. (Most of the skills in the game that at first seem bewilderingly esoteric actually do have fairly simple effects.)

In this Circus level, one lemming has become a “rock climber”: a sort of super-powered version of an ordinary climber, who can climb even a canted wall like this one.

In this Polar level, a lemming has become a “roper,” making a handy tightrope up and over the tree blocking the path.

In this Space level, we’ve made a “SuperLem” who flies in the direction of the mouse cursor.


Other pieces of plumbing help to make Lemmings 2 feel like a real, holistic game rather than a mere series of puzzles. The first game, as you may recall, gives you an arbitrary number of lemmings which begin each level and an arbitrary subset of them which must survive it; this latter number thus marks the difference between success and failure. In the sequel, though, each tribe starts its first level with 60 lemmings, who are carried over through all of the levels that follow. Any lemmings lost on one level, in other words, don’t come back in the succeeding ones. It’s possible to limp to the final finish line with just one solitary survivor remaining — and, indeed, you quite probably will do exactly this with a few of the tribes the first time through. But it’s also possible to finish all but a few of the levels without killing any lemmings at all. At the end of each level and then again at the end of each tribe’s collection of levels, you’re awarded a bronze, silver, or gold star based on your performance. To wind up with gold at the end, you usually need to have kept every single one of the little fellows alive through all ten levels.

There’s a certain thematic advantage in this: people often note how the hyper-cute original Lemmings is really one of the most violent videogames ever, requiring you to kill thousands and thousands of the cuties over its course. This objection no longer applies to Lemmings 2. But more importantly, it sets up an obsessive-compulsive-perfectionist loop. First you’ll just want to get through the levels — but then all those bronze and silver performances lurking in your past will start to grate, and pretty soon you’ll be trying to figure out how to do each level just that little bit more efficiently. The ultimate Lemmings 2 achievement, needless to say, is to collect gold stars across the board.
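For the mechanically minded, the carry-over and medal system is simple enough to model in a few lines of code. The C sketch below is purely illustrative: the per-level losses are invented, and the silver and gold thresholds are my own guesses rather than DMA’s actual cutoffs. But it captures the shape of the loop that turns perfectionists into repeat players.

```c
#include <stdio.h>

/* A toy model of a Lemmings 2 tribe's progression: 60 lemmings enter the
   first level, and only the survivors march on to the next. The medal
   thresholds here are illustrative guesses, not DMA's actual rules. */

static const char *medal(int saved, int entered)
{
    if (saved == entered)         return "gold";   /* a perfect level */
    if (saved * 4 >= entered * 3) return "silver"; /* saved 75% or more */
    return "bronze";
}

int main(void)
{
    int survivors = 60;                                /* every tribe starts with 60 */
    int losses[10] = { 0, 2, 0, 5, 1, 0, 3, 0, 0, 4 }; /* a hypothetical playthrough */

    for (int level = 0; level < 10; level++) {
        int entered = survivors;
        survivors -= losses[level];
        printf("level %2d: %2d in, %2d out -> %s\n",
               level + 1, entered, survivors, medal(survivors, entered));
    }
    printf("the tribe limps home with %d of its 60 lemmings\n", survivors);
    return 0;
}
```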

This tiered approach to success and failure might be seen as evidence of a kinder design sensibility, but in most other respects just the opposite is true; Lemmings 2 has the definite feel of a game for the hardcore. The first Lemmings does a remarkably good job of teaching you how to play it interactively over the course of its first twenty levels or so, introducing you one by one to each of its skills along with its potential uses and limitations. There’s nothing remotely comparable in Lemmings 2; it just throws you in at the deep end. While there is a gradual progression in difficulty within each tribe’s levels, the game as a whole is a lumpier affair, especially in the beginning. Each level gives you access to between one and eight of the 52 available skills, whilst evincing no interest whatsoever in showing you how to use any of them. There is some degree of thematic grouping when it comes to the skills: the Highland lemmings like to toss cabers; the beach lemmings are fond of swimming, kayaking, and surfing; the alpine lemmings often need to ski or skate. Nevertheless, the sheer number of new skills you’re expected to learn on the fly is intimidating even for a veteran of the first game. The closest Lemmings 2 comes to its predecessor’s training levels is a few free-form sandbox environments where you can choose your own palette of skills and have at it. But even here, your education can be a challenging one, coming down as it still does to trial and error.

Your first hours with the game can be particularly intimidating; as soon as you’ve learned how one group of skills works well enough to finish one level, you’re confronted with a whole new palette of them on the next level. Even I, a huge fan of the first game, bounced off the second one quite a few times before I buckled down, started figuring out the skills, and, some time thereafter, started having fun.

Luckily, once you have put in the time to learn how the skills work, Lemmings 2 becomes very fun indeed — every bit as rewarding as the first game, possibly even more so. Certainly its level design is every bit as good — better in fact, relying more on logic and less on dodgy edge cases in the game engine than do the infamously difficult final levels of the first Lemmings. Even the spiky difficulty curve isn’t all bad; it can be oddly soothing to start on a new tribe’s relatively straightforward early levels after being taxed to the utmost on another tribe’s last level. If the first Lemmings is mountain climbing as people imagine it to be — a single relentless, ever-steeper ascent to a dizzying peak — the second Lemmings has more in common with the reality of the sport: a set of more or less difficult stages separated by more or less comfortable base camps. While it’s at least as daunting in the end, it does offer more ebbs and flows along the way.

One might say, then, that Lemmings 2 is designed around a rather literal interpretation of the concept of a sequel. That is to say, it assumes that you’ve played its predecessor before you get to it, and are now ready for its added complexity. That’s bracing for anyone who fulfills that criterion. But in 1993, the year of Lemmings 2‘s release, its design philosophy had more negative than positive consequences for its own commercial arc and for that of the franchise to which it belonged.

The fact is that Lemmings 2‘s attitude toward its sequel status was out of joint with the way sequels had generally come to function by 1993. In a fast-changing industry that was constantly attracting new players, the ideal sequel, at least in the eyes of most industry executives, was a game equally welcoming to both neophytes and veterans. Audiovisual standards were changing so rapidly that a game that was just a couple of years old could already look painfully dated. What new player with a shiny new computer wanted to play some ugly old thing just to earn a right to play the latest and greatest?

That said, Lemmings 2 actually didn’t look all that much better than its predecessor either, flashy opening movie aside. Part of this was down to DMA Design still using the 1985-vintage Commodore Amiga, which remained very popular as a gaming computer in Britain and other European countries, as their primary development platform, then porting the game to MS-DOS and various other more modern platforms. Staying loyal to the Amiga meant working within some fairly harsh restrictions, such as that of having no more than 32 colors on the screen at once, not to mention making the whole game compact enough to run entirely off floppy disk; hard drives, much less CD-ROM drives, were still not common among European Amiga owners. Shortly before the release of Lemmings 2, David Jones confessed to being “a little worried” about whether people would be willing to look beyond the unimpressive graphics and appreciate the innovations of the game itself. As it happened, he was right to be worried.

Lemmings and Oh No! More Lemmings sold in the millions across a bewildering range of platforms, from modern mainstream computers like the Apple Macintosh and Wintel machines to antique 8-bit computers like the Commodore 64 and Sinclair Spectrum, from handheld systems like the Nintendo Game Boy and Atari Lynx to living-room game consoles like the Sega Master System and the Nintendo Entertainment System. Lemmings 2, being a much more complex game under the hood as well as on the surface, wasn’t quite so amenable to being ported to just about any gadget with a CPU, even as its more off-putting initial character and its lack of new audiovisual flash did it no favors either. It was still widely ported and still became a solid success by any reasonable standard, mind you, but likely sold in the hundreds of thousands rather than the millions. All indications are that the first game and its semi-expansion pack continued to sell more copies than the second even after the latter’s release.

In the aftermath of this muted reception, the bloom slowly fell off the Lemmings rose, not only for the general public but also for DMA Design themselves. The franchise’s true jump-the-shark moment ironically came as part of an attempt to re-jigger the creatures to become media superstars beyond the realm of games. The Children’s Television Workshop, the creator of Sesame Street among other properties, was interested in moving the franchise onto television screens. In the course of these negotiations, they asked DMA to give the lemmings more differentiated personalities in the next game, to turn them from anonymous marchers, each just a few pixels across, into something more akin to individualized cartoon characters. Soon the next game was being envisioned as the first of a linked series of no less than four of them, each one detailing the further adventures of three of the tribes after their escape from the island at the end of Lemmings 2, each one ripe for trans-media adaptation by the Children’s Television Workshop. But the first game of this new generation, called The Lemmings Chronicles, just didn’t work. The attempt to cartoonify the franchise was cloying and clumsy, and the gameplay fell to pieces; unlike Lemmings 2, Lemmings Chronicles eminently deserves its underwhelming critical reputation. DMA insiders like Mike Dailly have since admitted that it was developed more out of obligation than enthusiasm: “We were all ready to move on.” When it performed even worse than its predecessor, the Children’s Television Workshop dropped out; all of its compromises had been for nothing.

Released just a year after Lemmings 2, Lemmings Chronicles marked the last game in the six-game contract that DMA Design had signed with their publisher Psygnosis what seemed like an eternity ago — in late 1987 to be more specific, when David Jones had first come to Psygnosis with his rather generic outer-space shoot-em-up Menace, giving no sign that he was capable of something as ingenious as Lemmings. Now, having well and truly demonstrated their ingenuity, DMA had little interest in re-upping; they were even willing to leave behind all of their intellectual property, which the contract Jones had signed gave to Psygnosis in perpetuity. In fact, they were more than ready to leave behind the cute-and-cuddly cartoon aesthetic of Lemmings and return to more laddish forms of gaming. The eventual result of that desire would be a second, more long-lasting worldwide phenomenon, known as Grand Theft Auto.

Meanwhile Sony, who had acquired Psygnosis in 1993, continued off and on to test the waters with new iterations of the franchise, but all of those attempts evinced the same vague sense of ennui that had doomed Lemmings Chronicles; none became hits. The last Lemmings game that wasn’t a remake appeared in 2000.

It’s interesting to ask whether DMA Design and Psygnosis could have managed the franchise better, thereby turning it into a permanent rather than a momentary icon of gaming, perhaps even one on a par with the likes of Super Mario and Sonic the Hedgehog; they certainly had the sales to compete head-to-head with those other videogame icons for a few years there in the early 1990s. The obvious objection is that Mario and Sonic were individualized characters, while DMA’s lemmings were little more than a handful of tropes moving in literal lockstep. Still, more has been done with less in the annals of media history. If everyone had approached Lemmings Chronicles with more enthusiasm and a modicum more writing and branding talent, maybe the story would have turned out differently.

Many speculate today that the franchise must inevitably see another revival at some point, what with 21st-century pop culture’s tendency to mine not just the A-list properties of the past, but increasingly its B- and C-listers as well, in the name of one generation’s nostalgia and another’s insatiable appetite for kitsch. Something tells me as well that we haven’t seen the last of Lemmings, but, as of this writing anyway, the revival still hasn’t arrived.

As matters currently stand, then, the brief-lived but frenzied craze for Lemmings has gone down in history, alongside contemporaries like Tetris and The Incredible Machine, as one more precursor of the casual revolution in gaming that was still to come, with its very different demographics and aesthetics. But in addition to that, it gave us two games that are brilliant in their own right, that remain as vexing but oh-so-rewarding as they were in their heyday. Long may they march on.

One other surviving tribute to Dundee’s second most successful gaming franchise is this little monument at the entrance to the city’s Seabraes Park, erected by local artist Alyson Conway in 2013. Lemmings and Grand Theft Auto… not bad for a city of only 150,000 souls.

(Sources: the book Grand Thieves and Tomb Raiders by Magnus Anderson and Rebecca Levene; Compute! of January 1992; Amiga Format of May 1993 and the special 1992 annual; Retro Gamer 39; The One of November 1993; Computer Gaming World of July 1993.

Lemmings 2 has never gotten a digital re-release. I therefore make it available for download here, packaged to be as easy as possible to get running under DOSBox on your modern computer.)



The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while pushing its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would now offer 16-bit resolution — i.e., the same quality as an audio CD.

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs —  but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly beyond schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed to the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; the games still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D). But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly in vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beat the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh of Commodore’s.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals as well asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992 Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.
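To make that distinction concrete, here is a tiny C sketch of what using AmigaOS “only as a bootstrap” amounted to in practice. The address itself is genuine (the Amiga’s custom chips are memory-mapped, with the background-color register COLOR00 at $DFF180), but the fragment is a schematic illustration of the habit, not code from any actual game.

```c
#include <stdint.h>

/* The Amiga's custom-chip registers sit at fixed memory addresses;
   COLOR00, the background color, lives at $DFF180. Nothing in the
   hardware stops a program from writing there directly. */
#define COLOR00 ((volatile uint16_t *)0xDFF180UL)

void set_background_the_fast_way(uint16_t rgb)
{
    /* The route many European game coders took: skip the OS entirely.
       It is fast and enables per-scanline tricks, but it hard-wires an
       assumption about this exact chip layout -- one reason so many
       games broke when the AGA chipset arrived. */
    *COLOR00 = rgb;
}

/* The "correct" route would be a graphics.library call such as
   SetRGB4(viewport, 0, r, g, b), letting the OS mediate between the
   program and whatever display hardware is actually present. */
```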

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which were stamped with characters only in some continental European markets that needed them. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.
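The practical consequence of that missing memory protection is easy to demonstrate. The deliberately buggy C fragment below is a generic illustration of the failure mode, not Amiga-specific code: one program’s garden-variety bug becomes every program’s problem when nothing polices the address space.

```c
#include <stdio.h>
#include <string.h>

/* Two buffers standing in for the private memory of two running
   programs. In a flat, unprotected address space, nothing but good
   behavior keeps one program out of the other's memory. */
static char task_a_scratch[16];
static char task_b_state[32] = "task B's precious data";

int main(void)
{
    /* Task A overruns its own allocation -- a garden-variety bug... */
    strcpy(task_a_scratch, "a string far too long for sixteen bytes");

    /* ...and with no MMU to object, the stray bytes may land in memory
       belonging to someone else. On the Amiga the typical result was a
       Guru Meditation and a reboot, not one crashed application. */
    printf("task B now sees: \"%s\"\n", task_b_state);
    return 0;
}
```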

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped less than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their resumes, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like these, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. This proved to be an ironically more profitable endeavor for them than that of actually selling computers or game consoles. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks, who saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its last issue until the rather astonishingly late date of November of 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses as a shareholder by raising his salary as an executive.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and he stopped throwing good money after bad as soon as the company had no more collateral to offer in exchange. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died on June 20, 1994, at age 62, of the kidney disease he had been battling for most of his life. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes it Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)

 


The 68000 Wars, Part 5: The Age of Multimedia

A group of engineers from Commodore dropped in unannounced on the monthly meeting of the San Diego Amiga Users Group in April of 1988. They said they were on their way to West Germany with some important new technology to share with their European colleagues. With a few hours to spare before they had to catch their flight, they’d decided to share it with the user group’s members as well.

They had with them nothing less than the machine that would soon be released as the next-generation Amiga: the Amiga 3000. From the moment they powered it up to display the familiar Workbench startup icon re-imagined as a three-dimensional ray-traced rendering, the crowd was in awe. The new model sported a 68020 processor running at more than twice the clock speed of the old 68000, with a set of custom chips redesigned to match its throughput; graphics in 2 million colors instead of 4096, shown at non-interlaced — read, non-flickering — resolutions of 640 X 400 and beyond; an AmigaOS 2.0 Workbench that looked far more professional than the garish version 1.3 that was shipping with current Amigas. The crowd was just getting warmed up when the team said they had to run. They did, after all, have a plane to catch.

Word spread like crazy over the online services. Calls poured in to Commodore’s headquarters in West Chester, Pennsylvania, where no one seemed to know what any of the callers were talking about. Clearly this must be a very top-secret project; the engineering team must have committed a major breach of protocol by jumping the gun as they had. Who would have dreamed that Commodore was already in the final stages of a project which the Amiga community had been begging them just to get started on?

Who indeed? The whole thing was a lie. The tip-off was right there in the April date of the San Diego Users Group Meeting. The president of the group, along with a few co-conspirators, had taken a Macintosh II motherboard and shoehorned it into an Amiga 2000 case. They’d had “Amiga 3000” labels typeset and stuck them on the case, and created some reasonable-looking renderings of Amiga applications, just enough to get them through the brief amount of time their team of “Commodore engineers” — actually people from the nearby Los Angeles Amiga Users Group — would spend presenting the package. When the truth came out, some in the Amiga community congratulated the culprits for a prank well-played, while others were predictably outraged. What hurt more than the fact that they had been fooled was the reality that a Macintosh that was available right now had been able to impersonate an Amiga that existed only in their dreams. If that wasn’t an ominous sign for their favored platform’s future, it was hard to say what would be.

Of course, this combination of counterfeit hardware and sketchy demos, no matter how masterfully acted before the audience, couldn’t have been all that convincing to a neutral observer with a modicum of skepticism. Like all great hoaxes, this one succeeded because it built upon what its audience already desperately wanted to believe. In doing so, it inadvertently provided a preview of what it would mean to be an Amiga user in the future: an ongoing triumph of hope over hard-won experience. It’s been said before that the worst thing you can do is to enter into a relationship in the hope that you will be able to change the other party. Amiga users would have reason to learn that lesson over and over again: Commodore would never change. Yet many would never take the lesson to heart. To be an Amiga user would be always to be fixated upon the next shiny object out there on the horizon, always to be sure this would be the thing that would finally turn everything around, only to be disappointed again and again.

Hoaxes aside, rumors about the Amiga 3000 had been swirling around since the introduction of the 500 and 2000 models in 1987. But for a long time a rumor was all the new machine was, even as the MS-DOS and Macintosh platforms continued to evolve apace. Commodore’s engineering team was dedicated and occasionally brilliant, but their numbers were tiny in comparison to those of comparable companies, much less bigger ones like Apple and IBM, the latter of which had an annual research budget greater than Commodore’s total sales. And Commodore’s engineers were perpetually underpaid and underappreciated by their managers to boot. The only real reason for a top-flight engineer to work at Commodore was love of the Amiga itself. In light of the conditions under which they were forced to work, what the engineering staff did manage to accomplish is remarkable.

After the crushing disappointment that had been the 1989 Christmas season, when Commodore’s last and most concerted attempt to break the Amiga 500 into the American mainstream had failed, it didn’t take hope long to flower again in the new year. “The chance for an explosive Amiga market growth is still there,” wrote Amazing Computing at that time, in a line that could have summed up the sentiment of every issue they published between 1986 and 1994.

Still, reasons for optimism did seemingly exist. For one thing, Commodore’s American operation had another new man in charge, an event which always brought with it the hope that the new boss might not prove the same as the old boss. Replacing the unfortunately named Max Toy was Harold Copperman, a real, honest-to-goodness computer-industry veteran, coming off a twenty-year stint with IBM, followed by two years with Apple; he had almost literally stepped offstage from the New York Mac Business Expo, where he had introduced John Sculley to the speaker’s podium, and into his new office at Commodore. With the attempt to pitch the Amiga 500 to low-end users as the successor to the Commodore 64 having failed to gain any traction, the biggest current grounds for optimism was that Copperman, whose experience was in business computers, could make inroads into that market for the higher-end Amiga models. Rumor had it that the dismissal of Toy and the hiring of Copperman had occurred following a civil war that had riven the company, with one faction — Toy apparently among them — saying Commodore should de-emphasize the Amiga in favor of jumping on the MS-DOS bandwagon, while the other faction saw little future — or, perhaps better said, little profit margin — in becoming just another maker of commodity clones. If you were an Amiga fan, you could at least breathe a sigh of relief that the right side had won out in that fight.

The Amiga 3000

It was in that hopeful spring of 1990 that the real Amiga 3000, a machine custom-made for the high-end market, made its bow. It wasn’t a revolutionary update to the Amiga 2000 by any means, but it did offer some welcome enhancements. In fact, it bore some marked similarities to the hoax Amiga 3000 of 1988. For instance, replacing the old 68000 was a 32-bit 68030 processor, and replacing AmigaOS 1.3 was the new and much-improved — both practically and aesthetically — AmigaOS 2.0. The flicker of the interlaced graphics modes could finally be a thing of the past, at least if the user sprang for the right type of monitor, and a new “super-high resolution” mode of 1280 X 400 was available, albeit with only four onscreen colors. The maximum amount of “chip memory” — memory that could be addressed by the machine’s custom chips, and thus could be fully utilized for graphics and sound — had already increased from 512 K to 1 MB with the release of a “Fatter Agnus” chip, which could be retrofitted into older examples of the Amiga 500 and 2000, in 1989. Now it increased to 2 MB with the Amiga 3000.

The rather garish and toy-like AmigaOS 1.3 Workbench.

The much slicker Workbench 2.0.

So, yes, the Amiga 3000 was very welcome, as was any sign of technological progress. Yet it was also hard not to feel a little disappointed that, five years after the unveiling of the first Amiga, the platform had only advanced this far. The hard fact was that Commodore’s engineers, forced to work on a shoestring as they were, were still tinkering at the edges of the architecture that Jay Miner and his team had devised all those years before rather than truly digging into it to make the more fundamental changes that were urgently needed to keep up with the competition. The interlace flicker was eliminated, for instance, not by altering the custom chips themselves but by hanging an external “flicker fixer” onto the end of the bus to de-interlace the interlaced output they still produced before it reached the monitor. And the custom chips still ran no faster than they had in the original Amiga, meaning the hot new 68030 had to slow down to a crawl every time it needed to access the chip memory it shared with them. The color palette remained stuck at 4096 shades, and, with the exception of the new super-high resolution mode, whose weirdly stretched pixels and four colors limited its usability, the graphics modes as a whole remained unchanged. Amiga owners had spent years mocking the Apple Macintosh and the Atari ST for their allegedly unimaginative, compromised designs, contrasting them continually with Jay Miner’s elegant dream machine. Now, that argument was getting harder to make; the Amiga too was starting to look a little compromised and inelegant.

Harold Copperman personally introduced the Amiga 3000 in a lavish event — lavish at least by Commodore’s standards — held at New York City’s trendy Palladium nightclub. With CD-ROM in the offing and audiovisual standards improving rapidly across the computer industry, “multimedia” stood with the likes of “hypertext” as one of the great buzzwords of the age. Commodore was all over it, even going so far as to name the event “Multimedia Live!” From Copperman’s address:

It’s our turn. It’s our time. We had the technology four and a half years ago. In fact, we had the product ready for multimedia before multimedia was ready for a product. Today we’re improving the technology, and we’re in the catbird seat. It is our time. It is Commodore’s time.

I’m at Commodore just as multimedia becomes the most important item in the marketplace. Once again I’m with the leader. Of course, in this industry a leader doesn’t have any followers; he just has a lot of other companies trying to pass him by. But take a close look: the other companies are talking multimedia, but they’re not doing it. They’re a long way behind Commodore — not even close.

Multimedia is a first-class way for conveying a message because it takes the strength of the intellectual content and adds the verve — the emotion-grabbing, head-turning, pulse-raising impact that comes from great visuals plus a dynamic soundtrack. For everyone with a message to deliver, it unleashes extraordinary ability. For the businessman, educator, or government manager, it turns any ordinary meeting into an experience.

In a way, this speech was cut from the same cloth as the Amiga 3000 itself. It was certainly a sign of progress, but was it progress enough? Even as he sounded more engaged and more engaging than had plenty of other tepid Commodore executives, Copperman inadvertently pointed out much of what was still wrong with the organization he helmed. He was right that Commodore had had the technology to do multimedia for a long time; as I’ve argued at length elsewhere, the Amiga was in fact the world’s first multimedia personal computer, all the way back in 1985. Still, the obvious question one is left with after reading the first paragraph of the extract above is why, if Commodore had the technology to do multimedia four and a half years ago, they waited until now to tell anyone about it. In short, why was the world of 1990 “ready” for multimedia when the world of 1985 wasn’t? Contrary to Copperman’s claim about being a leader, Commodore’s own management had begun to evince an understanding of what the Amiga was and what made it special only after other companies had started building computers similar to it. Real business leaders don’t wait around for the world to decide it’s ready for their products; they make products the world doesn’t yet know it needs, then tell it why it needs them. Five years after being gifted with the Amiga, which stands alongside the Macintosh as one of the two most visionary computers of the 1980s precisely because of its embrace of multimedia, Commodore managed at this event to give every impression of being multimedia bandwagon jumpers.

The Amiga 3000 didn’t turn into the game changer the faithful were always dreaming of. It sold moderately, mostly to the established Amiga hardcore, but had little obvious effect on the platform’s overall marketplace position. Harold Copperman was blamed for the disappointment, and was duly fired by Irving Gould, the principal shareholder and ultimate authority at Commodore, at the beginning of 1991. The new company line became an exact inversion of that which had held sway at the time of the Amiga 3000’s introduction: Copperman’s expertise was business computing, but Commodore’s future lay in consumer computing. Jim Dionne, head of Commodore’s Canadian division and supposedly an expert consumer marketer, was brought in to replace him.

An old joke began to make the rounds of the company once again. A new executive arrives at his desk at Commodore and finds three envelopes in the drawer, each labelled “open in case of emergency” and numbered one, two, and three. When the company gets into trouble for the first time on his watch, he opens the first envelope. Inside is a note: “Blame your predecessor.” So he does, and that saves his bacon for a while, but then things go south again. He opens the second envelope: “Blame your vice-presidents.” So he does, and gets another lease on life, but of course it only lasts a little while. He opens the third envelope. “Prepare three envelopes…” he begins to read.

Yet anyone who happened to be looking closely might have observed that the firing of Copperman represented something more than the usual shuffling of the deck chairs on the S.S. Commodore. Upon his promotion, it was made clear to Jim Dionne that he was to be held on a much shorter leash than his predecessors, his authority carefully circumscribed. Filling the power vacuum was one Mehdi Ali, a lawyer and finance guy who had come to Commodore a couple of years before as a consultant and had since insinuated himself ever deeper into Irving Gould’s confidence. Now he advanced to the title of president of Commodore International, Gould’s right-hand man in running the global organization; indeed, he seemed to be calling far more shots these days than his globe-trotting boss, who never seemed to be around when you needed him anyway. Ali’s rise would not prove a happy event for anyone who cared about the long-term health of the company.

For now, though, the full import of the changes in Commodore’s management structure was far from clear. Amiga users were on to the next Great White Hope, one that in fact had already been hinted at in the Palladium as the Amiga 3000 was being introduced. Once more “multimedia” would be the buzzword, but this time the focus would go back to the American consumer market Commodore had repeatedly failed to capture with the Amiga 500. The clue had been there in a seemingly innocuous, almost throwaway line from the speech delivered to the Palladium crowd by C. Lloyd Mahaffrey, Commodore’s director of marketing: “While professional users comprise the majority of the multimedia-related markets today, future plans call for penetration into the consumer market as home users begin to discover the benefits of multimedia.”

Commodore’s management, (proud?) owners of the world’s first multimedia personal computer, had for most of the latter 1980s been conspicuous by their complete indifference to their industry’s initial forays into CD-ROM, the storage medium that, along with the graphics and sound hardware the Amiga already possessed, could have been the crowning piece of the platform’s multimedia edifice. The indifference persisted in spite of the subtle and eventually blatant hints that were being dropped by people like Cinemaware’s Bob Jacob, whose pioneering “interactive movies” were screaming to be liberated from the constraints of 880 K floppy disks.

In 1989, a tiny contingent of Commodore’s small engineering staff — described as “mavericks” by at least one source — resolved to take matters into their own hands, mating an Amiga with a CD-ROM drive and preparing a few demos designed to convince their managers of the potential that was being missed. Management was indeed convinced by the demo — but convinced to go in a radically different direction from that of simply making a CD-ROM drive that could be plugged into existing Amigas.

The Dutch electronics giant Philips had been struggling for what seemed like forever to finish something they envisioned as a whole new category of consumer electronics: a set-top box for the consumption of interactive multimedia content on CD. They called it CD-I, and it was already very, very late. Originally projected for release in time for the Christmas of 1987, its constant delays had left half the entertainment-software industry, who had invested heavily in the platform, in limbo on the whole subject of CD-ROM. What if Commodore could steal Philips’s thunder by combining a CD-ROM drive with the audiovisually capable Amiga architecture not in a desktop computer but in a set-top box of their own? This could be the magic bullet they’d been looking for, the long-awaited replacement for the Commodore 64 in American living rooms.

The industry’s fixation on these CD-ROM set-top boxes — a fixation which was hardly confined to Philips and Commodore alone — perhaps requires a bit of explanation. One thing these gadgets were not, at least if you listened to the voices promoting them, was game consoles. The set-top boxes could be used for many purposes, from displaying multimedia encyclopedias to playing music CDs. And even when they were used for pure interactive entertainment, it would be, at least potentially, adult entertainment (a term that was generally not meant in the pornographic sense, although some were already muttering about the possibilities that lurked therein as well). This was part and parcel of a vision that came to dominate much of digital entertainment between about 1989 and 1994: that of a sort of grand bargain between Northern and Southern California, a melding of the new interactive technologies coming out of Silicon Valley with the movie-making machine of Hollywood. Much of television viewing, so went the argument, would become interactive, the VCR replaced with the multimedia set-top box.

In light of all this conventional wisdom, Commodore’s determination to enter the fray — effectively to finish the job that Philips couldn’t seem to — can all too easily be seen as just another example of the me-too-ism that had clung to their earlier multimedia pronouncements. At the time, though, the project was exciting enough that Commodore was able to lure quite a number of prominent names to work with them on it. Carl Sassenrath, who had designed the core of the original AmigaOS — including its revolutionary multitasking capability — signed on again to adapt his work to the needs of a set-top box. (“In many ways, it was what we had originally dreamed for the Amiga,” he would later say of the project, a telling quote indeed.) Jim Sachs, still the most famous of Amiga artists thanks to his work on Cinemaware’s Defender of the Crown, agreed to design the look of the user interface. Reichart von Wolfsheild and Leo Schwab, both well-known Amiga developers, also joined. And for the role of marketing evangelist Commodore hired none other than Nolan Bushnell, the founder, almost two decades before, of Atari, the very first company to place interactive entertainment in American living rooms. The project as a whole was placed in the capable hands of Gail Wellington, known throughout the Amiga community as the only Commodore manager with a dollop of sense. The gadget itself came to be called CDTV — an acronym, Commodore would later claim in a part of the sales pitch that fooled no one, for “Commodore Dynamic Total Vision.”

Nolan Bushnell, Mr. Atari himself, plugs CDTV at a trade show.

Commodore announced CDTV at the Summer Consumer Electronics Show in June of 1990, inviting selected attendees to visit a back room and witness a small black box, looking for all the world like a VCR or a stereo component, running some simple demos. From the beginning, they worked hard to disassociate the product from the Amiga and, indeed, from computers in general. The word “Amiga” appeared nowhere on the hardware or anywhere on the packaging, and if all went according to plan CDTV would be sold next to televisions and stereos in department stores, not in computer shops. Commodore pointed out that everything from refrigerators to automobiles contained microprocessors these days, but no one called those things computers. Why should CDTV be any different? It required no monitor, instead hooking up to the family television set. It neither included nor required a keyboard — much industry research had supposedly proved that non-computer users feared keyboards more than anything else — nor even a mouse, being controlled entirely through a remote control that looked pretty much like any other specimen of same one might find between the cushions of a modern sofa. “If you know how to change TV channels,” said a spokesman, “you can take full advantage of CDTV.” It would be available, Commodore claimed, before the Christmas of 1990, which should be well before CD-I despite the latter’s monumental head start.

That timeline sounded overoptimistic even when it was first announced, and few were surprised to see the launch date slip into 1991. But the extra time did allow a surprising number of developers to jump aboard the CDTV train. Commodore had never been good at developer relations, and weren’t terribly good at it now; developers complained that the tools Commodore provided were always late and inadequate and that help with technical problems wasn’t easy to come by, while financial help was predictably nonexistent. Still, lots of CD-I projects had been left in limbo by Philips’s dithering and were attractive targets for adaptation to CDTV, while the new platform’s Amiga underpinnings made it fairly simple to port over extant Amiga games like SimCity and Battle Chess. By early 1991, Commodore could point to about fifty officially announced CDTV titles, among them products from such heavy hitters as Grolier, Disney, Guinness (the publisher, not the beer company), Lucasfilm, and Sierra. This relatively long list of CDTV developers certainly seemed a good sign, even if not all of the products they proposed to create looked likely to be all that exciting, or perhaps even all that good. Plenty of platforms, including the original Amiga, had launched with much less.

While the world — or at least the Amiga world — held its collective breath waiting for CDTV’s debut, the charismatic Nolan Bushnell did what he had been hired to do: evangelize like crazy. “What we are really trying to do is make multimedia a reality, and I think we’ve done that,” he said. The hyperbole was flying thick and fast from all quarters. “This will change forever the way we communicate, learn, and entertain,” said Irving Gould. Not to be outdone, Bushnell noted that “books were great in their day, but books right now don’t cut it. They’re obsolete.” (Really, why was everyone so determined to declare the death of the book during this period?)

CDTV being introduced at the 1991 World of Amiga show. Doing the introducing is Gail Wellington, head of the CDTV project and one of the unsung heroes of Commodore.

The first finished CDTV units showed up at the World of Amiga show in New York City in April of 1991; Commodore sold their first 350 to the Amiga faithful there. A staggered roll-out followed: to five major American cities, Canada, and the Commodore stronghold of Britain in May; to France, Germany, and Italy in the summer; to the rest of the United States in time for Christmas. With CD-I now four years late, CDTV thus became the first CD-ROM-based set-top box you could actually go out and buy. Doing so would set you back just under $1000.

The Amiga community, despite being less than thrilled by the excision of all mention of their platform’s name from the product, greeted the launch with the same enthusiasm they had lavished on the Amiga 3000, their Great White Hope of the previous year, or for that matter the big Christmas marketing campaign of 1989. Amazing Computing spoke with bated breath of CDTV becoming the “standard for interactive multimedia consumer hardware.”

“Yes, but what is it for?” These prospective customers’ confusion is almost palpable.

Alas, there followed a movie we’ve already seen many times. Commodore’s marketing was ham-handed as usual, declaring CDTV “nothing short of revolutionary” but failing to describe in clear, comprehensible terms why anyone who was more interested in relaxing on the sofa than fomenting revolutions might actually want one. The determination to disassociate CDTV from the scary world of computers was so complete that the computer magazines weren’t even allowed advance models; Amiga Format, the biggest Amiga magazine in Britain at the time with a circulation of more than 160,000, could only manage to secure their preview unit by making a side deal with a CDTV developer. CDTV units were instead sent to stereo magazines, who shrugged their shoulders at this weird thing this weird computer company had sent them and returned to reviewing the latest conventional CD players. Nolan Bushnell, the alleged marketing genius who was supposed to be CDTV’s ace in the hole, talked a hyperbolic game at the trade shows but seemed otherwise disengaged, happy just to show up and give his speeches and pocket his fat paychecks. One could almost suspect — perish the thought! — that he had only taken this gig for the money.

In the face of all this, CDTV struggled mightily to make any headway at all. When CD-I hit the market just before Christmas, boasting more impressive hardware than CDTV for roughly the same price, it only made the hill that much steeper. Commodore now had a rival in a market category whose very existence consumers still obstinately refused to recognize. As an established maker of consumer electronics in good standing with the major retailers — something Commodore hadn’t been since the heyday of the Commodore 64 — Philips had lots of advantages in trying to flog their particular white elephant, not to mention an advertising budget their rival could only dream of. CD-I was soon everywhere, on store shelves and in the pages of the glossy lifestyle magazines, while CDTV was almost nowhere. Commodore did what they could, cutting the list price of CDTV to less than $800 and bundling with it The New Grolier Encyclopedia and the smash Amiga game Lemmings. It didn’t help. After an ugly Christmas season, Nolan Bushnell and the other big names all deserted the sinking ship.

Even leaving aside the difficulties inherent in trying to introduce people to an entirely new category of consumer electronics — difficulties that were only magnified by Commodore’s longstanding marketing ineptitude — CDTV had always been problematic in ways that had been all too easy for the true believers to overlook. It was clunky in comparison to CD-I, with a remote control that felt awkward to use, especially for games, and a drive which required that the discs first be placed into an external holder before being loaded into the unit proper. More fundamentally, the very re-purposing of old Amiga technology that had allowed it to beat CD-I to market made it an even more limited platform than its rival for running the sophisticated adult entertainments it was supposed to have enabled. Much of the delay in getting CD-I to market had been the product of a long struggle to find a way of doing video playback with some sort of reasonable fidelity. Even the released CD-I performed far from ideally in this area, but it did better than CDTV, which at best — at best, mind you — might be able to fill about a third of the television screen with low-resolution video running at a choppy twelve frames per second. It was going to be hard to facilitate a union of Silicon Valley and Hollywood with technology like that.
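Some back-of-the-envelope arithmetic shows why; take these figures as my own rough estimate rather than anything from Commodore’s spec sheets. CDTV’s single-speed drive could deliver at most about 150 K of data per second, while the Amiga’s standard low-resolution display of 320 X 200 pixels used up to five bitplanes — five bits per pixel — for its 32 colors. That works out to:

150 K per second ÷ 12 frames per second ≈ 12.5 K per frame
(320 X 200 ÷ 3) pixels × 5 bits per pixel ≈ 13 K per frame, uncompressed

Even a third of the screen at a choppy twelve frames per second thus pressed right up against the raw throughput of the drive, with nothing left over for audio or for the overhead of decompression.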

None of CDTV’s problems were the fault of the people who had created it, who had, like so many Commodore engineers before and after them, been asked to pull off a miracle on a shoestring. They had managed to create, if not quite a miracle, something that worked far better than it had a right to. It just wasn’t quite good enough to overcome the marketing issues, the competition from CD-I, and the marketplace confusion engendered by an interactive set-top box that said it wasn’t a game console but definitely wasn’t a home computer either.

CDTV could be outfitted with a number of accessories that turned it into more of a “real” computer. Still, those making software for the system couldn’t count on any of these accessories being present, which served to greatly restrict their products’ scope of possibility.

Which isn’t to say that some groundbreaking work wasn’t done by the developers who took a leap of faith on Commodore — almost always a bad bet in financial terms — and produced software for the platform. CDTV’s early software catalog was actually much more impressive than that of CD-I, whose long gestation had caused so many initially enthusiastic developers to walk away in disgust. The New Grolier Encyclopedia was a true multimedia encyclopedia; the entry for John F. Kennedy, for example, included not only a textual biography and photos to go along with it but audio excerpts from his most famous speeches. The American Heritage Dictionary also offered images where relevant, along with an audio pronunciation of every single word. American Vista: The Multimedia U.S. Atlas boasted lots of imagery of its own to add flavor to its maps, and could plan a route between any two points in the country at the click of a button. All of these things may sound ordinary today, but in a way that very modern ordinariness is a testament to what pioneering products these really were. They did in fact present an argument that, while others merely talked about the multimedia future, Commodore through CDTV was doing it — imperfectly and clunkily, yes, but one has to start somewhere.

One of the most impressive CDTV titles of all marked the return of one of the Amiga’s most beloved icons. After designing the CDTV’s menu system, the indefatigable Jim Sachs returned to the scene of his most famous creation. Really a remake rather than a sequel, Defender of the Crown II reintroduced much of the additional graphics and tactical complexity that had been excised from the original in the name of saving time, pairing it with a full orchestral soundtrack, digitized sound effects, and a narrator to detail the proceedings in the appropriate dulcet English accent. It was, Sachs said, “the game the original Defender of the Crown was meant to be, both in gameplay and graphics.” He did almost all of the work on this elaborate multimedia production himself, farming out little more than the aforementioned narration, and Commodore themselves released the game, having acquired the right to do so from the now-defunct Cinemaware at auction. While, as with the original, its long-term play value is perhaps questionable, Defender of the Crown II even today still looks and sounds mouth-wateringly gorgeous.


If any one title on CDTV was impressive enough to sell the machine by itself, this ought to have been it. Unfortunately, it didn’t appear until well into 1992, by which time CDTV already had the odor of death clinging to it. The very fact that Commodore allowed the game to be billed as the sequel to one so intimately connected to the Amiga’s early days speaks to a marketing change they had instituted to try to breathe some life back into the platform.

The change was born out of an insurrection staged by Commodore’s United Kingdom branch, who always seemed to be about five steps ahead of the home office in any area you cared to name. Kelly Sumner, managing director of Commodore UK:

We weren’t involved in any of the development of CDTV technology; that was all done in America. We were taking the lead from the corporate company. And there was a concrete stance of “this is how you promote it, this is the way forward, don’t do this, don’t do that.” So, that’s what we did.

But after six or eight months we basically turned around and said, “You don’t know what you’re talking about. It ain’t going to go anywhere, and if it does go anywhere you’re going to have to spend so much money that it isn’t worth doing. So, we’re going to call it the Amiga CDTV, we’re going to produce a package with disk drives and such like, and we’re going to promote it like that. People can understand that, and you don’t have to spend so much money.”

True to their word, Commodore UK put together what they called “The Multimedia Home Computer Pack,” combining a CDTV unit with a keyboard, a mouse, an external disk drive, and the software necessary to use it as a conventional Amiga as well as a multimedia appliance — all for just £100 more than a CDTV unit alone. Commodore’s American operation grudgingly followed their lead, allowing the word “Amiga” to creep back into their presentations and advertising copy.

Very late in the day, Commodore finally began acknowledging and even celebrating CDTV’s Amigahood.

But it was too late — and not only for CDTV but in another sense for the Amiga platform itself. The great hidden cost of the CDTV disappointment was the damage it did to the prospects for CD-ROM on the Amiga proper. Commodore had been so determined to position CDTV as its own thing that they had rejected the possibility of equipping Amiga computers as well with CD-ROM drives, despite the pleas of software developers and everyday customers alike. A CD-ROM drive wasn’t officially mated to the world’s first multimedia personal computer until the fall of 1992, when, with CDTV now all but left for dead, Commodore finally started shipping an external drive that made it possible to run most CDTV software, as well as CD-based software designed specifically for Amiga computers, on an Amiga 500. Even then, Commodore provided no official CD-ROM solution for Amiga 2000 and 3000 owners, forcing them to cobble together third-party adapters that could interface with drives designed for the Macintosh. The people who owned the high-end Amiga models, of course, were the ones working in the very cutting-edge fields that cried out for CD-ROM.

It’s difficult to overstate the amount of damage the Amiga’s absence from the CD-ROM party, the hottest ticket in computing at the time, did to the platform’s prospects. It single-handedly gave the lie to every word in Harold Copperman’s 1990 speech about Commodore being “the leaders in multimedia.” Many of the most vibrant Amiga developers were forced to shift to the Macintosh or another platform by the lack of CD-ROM support. Of all Commodore’s failures, this one must loom among the largest. They allowed the Macintosh to become the platform most associated with the new era of CD-ROM-enabled multimedia computing without even bothering to contest the territory. The war was over before Commodore even realized a war was on.

Commodore’s feeble last gasp in terms of marketing CDTV positioned it as essentially an accessory to desktop Amigas, a “low-cost delivery system for multimedia” targeted at business and government rather than living rooms. The idea was that you could create presentations on Amiga computers, send them off to be mastered onto CD, then drag the CDTV along to board meetings or planning councils to show them off. In that spirit, a CDTV unit was reduced to a free toss-in if you bought an Amiga 3000 — two slow-selling products that deserved one another.

The final verdict on CDTV is about as ugly as they come: less than 30,000 sold worldwide in some eighteen months of trying; less than 10,000 sold in the American market Commodore so desperately wanted to break back into, and many or most of those sold at fire-sale discounts after the platform’s fate was clear. In other words, the 350 CDTV units that had been sold to the faithful at that first ebullient World of Amiga show made up an alarmingly high percentage of all the CDTV units that would ever sell. (Philips, by contrast, would eventually manage to move about 1 million CD-I units over the course of about seven years of trying.)

The picture I’ve painted of the state of Commodore thus far is a fairly bleak one. Yet that bleakness wasn’t really reflected in the company’s bottom line during the first couple of years of the 1990s. For all the trouble Commodore had breaking new products in North America and elsewhere, their legacy products were still a force to be reckoned with outside the United States. Here the end of the Cold War and subsequent lifting of the Iron Curtain proved a boon. The newly liberated peoples of Eastern Europe were eager to get their hands on Western computers and computer games, but had little money to spend on them. The venerable old Commodore 64, pulling along behind it that rich catalog of thousands upon thousands of games of all stripes, was the perfect machine for these emerging markets. Effectively dead in North America and trending that way in Western Europe, it now enjoyed a new lease on life in the former Soviet sphere, its sales numbers suddenly climbing sharply again instead of falling. The Commodore 64 was, it seemed, the cockroach of computers; you just couldn’t kill it. Not that Commodore wanted to: they would happily bank every dollar their most famous creation could still earn them. Meanwhile the Amiga 500 was selling better than ever in Western Europe, where it was now the most popular single gaming platform of all, and Commodore happily banked those profits as well.

Commodore’s stock even enjoyed a short-lived bubble of sorts. In the spring and early summer of 1991, with sales strong all over Europe and CDTV poised to hit the scene, the stock price soared past $20, stratospheric heights by Commodore’s recent standards. This being Commodore, the stock collapsed below $10 again just as quickly — but, hey, it was nice while it lasted. In the fiscal year ending on June 30, 1991, worldwide sales topped the magical $1 billion mark, another height that had last been seen in the heyday of the Commodore 64. Commodore was now the second most popular maker of personal computers in Europe, with a market share of 12.4 percent, just slightly behind IBM’s 12.7 percent. The Amiga was now selling at a clip of 1 million machines per year, which would bring the total installed base to 4.5 million by the end of 1992. Of that total, 3.5 million were in Europe: 1.3 million in Germany, 1.2 million in Britain, 600,000 in Italy, 250,000 in France, 80,000 in Scandinavia. (Ironically in light of the machine’s Spanish name, one of the few places in Western Europe where it never did well at all was Spain.) To celebrate their European success, Irving Gould and Mehdi Ali took home salaries in 1991 of $1.75 million and $2.4 million respectively, the latter figure $400,000 more than the chairman of IBM, a company fifty times Commodore’s size, was earning.

But it wasn’t hard to see that Commodore, in relying on all of these legacy products sold in foreign markets, was living on borrowed time. Even in Europe, MS-DOS was beginning to slowly creep up on the Amiga as a gaming platform by 1992, while Nintendo and Sega, the two big Japanese console makers, were finally starting to take notice of this virgin territory after having ignored it for so long. While Amiga sales in Europe in 1992 remained blessedly steady, sales of the Amiga in North America were down as usual, sales of the Commodore 64 in Eastern Europe fell off thanks to economic chaos in the region, and sales of Commodore’s line of commodity PC clones cratered so badly that they pulled out of that market entirely. It all added up to total sales of about $900 million for the fiscal year ending on June 30, 1992. The company was still profitable, but considerably less so than it had been the year before. Everyone was now looking forward to 1993 with more than a little trepidation.

Even as Commodore faced an uncertain future, they could at least take comfort that their arch-enemy Atari was having a much worse time of it. In the very early 1990s, Atari enjoyed some success, if not as much as they had hoped, with their Lynx handheld game console, a more upscale rival to the Nintendo Game Boy. The Atari Portfolio, a genuinely groundbreaking palmtop computer, also did fairly well for them, if perhaps not quite as well as it deserved. But the story of their flagship computing platform, the Atari ST, was less happy. Already all but dead in the United States, the ST saw its European market share shrink in proportion to the Amiga’s increasing sales, falling from second to third most popular gaming computer in 1991, trailing MS-DOS now as well as the Amiga.

Atari tried to remedy the slowing sales with new machines they called the STe line, which increased the color palette to 4096 shades and added a blitter chip to aid onscreen animation. (The delighted Amiga zealots at Amazing Computing wrote of these Amiga-inspired developments that they reminded them of “an Amiga 500 created by a primitive tribe that had never actually seen an Amiga, but had heard reports from missionaries of what the Amiga could do.”) But the new hardware broke compatibility with much existing software, and it only got harder to justify buying an STe instead of an Amiga 500 as the latter’s price slowly fell. Atari’s total sales in 1991 were just $285 million, down by some 30 percent from the previous year and barely a quarter of the numbers Commodore was doing. Jack Tramiel and his sons kept their heads above water only by selling off pieces of the company, such as the Taiwanese manufacturing facility that went for $40.9 million that year. You didn’t have to be an expert in the computer business to understand how unsustainable that path was. In the second quarter of 1992, Atari posted a loss of $39.8 million on sales of just $23.3 million, a rather remarkable feat in itself. Whatever else lay in store for Commodore and the Amiga, they had apparently buried old Mr. “Business is War.”

Still, this was no time to bask in the glow of sweet revenge. The question of where Commodore and the Amiga went from here was being asked with increasing urgency in 1992, and for very good reason. The answer would arrive in the latter half of the year, in the form at long last of the real, fundamental technical improvements the Amiga community had been begging for for so long. But had Commodore done enough, and had they done it in time to make a difference? Those questions loomed large as the 68000 Wars were about to enter their final phase.

(Sources: the book On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall; Amazing Computing of August 1987, June 1988, June 1989, July 1989, May 1990, June 1990, July 1990, August 1990, September 1990, December 1990, January 1991, February 1991, March 1991, April 1991, May 1991, June 1991, August 1991, September 1991, November 1991, January 1992, February 1992, March 1992, April 1992, June 1992, July 1992, August 1992, September 1992, November 1992, and December 1992; Info of July/August 1988 and January/February 1989; Amiga Format of July 1991, July 1995, and the 1992 annual; The One of September 1990, May 1991, and December 1991; CU Amiga of June 1992, October 1992, and November 1992; Amiga Computing of April 1992; AmigaWorld of June 1991. Online sources include Matt Barton’s YouTube interview with Jim Sachs, Sébastien Jeudy’s interview with Carl Sassenrath, Greg Donner’s Workbench Nostalgia, and Atari’s annual reports from 1989, available on archive.org. My huge thanks to reader “himitsu” for pointing me to the last and providing some other useful information on Commodore and Atari’s financials during this period in the comments to a previous article in this series. And thank you to Reichart von Wolfsheild, who took time from his busy schedule to spend a Saturday morning with me looking back on the CDTV project.)

 
 


Games on the Mersey, Part 5: The Lemmings Effect

“Mummy, what’s a group of lemmings called?”

“A pact. That’s right, a suicide pact.”

“Mummy, when a lemming dies does it go to heaven like all good little girls?”

“Don’t be soft. They burn in hell — like distress flares.”

(courtesy of the January 1991 issue of The One)


 

If you had looked at the state of Psygnosis in 1990 and tried to decide which of their outside developers would break the mold of beautiful-but-empty action games, you would have had no reason to single out DMA Design over any of the others. Certainly Menace and Blood Money, the two games DMA had already created for Psygnosis, gave little sign that any visionaries lurked within their ranks. From their generic teenage-cool titles to their rote gameplay, both games were as typical of Psygnosis as anything else in their catalog.

And yet DMA Design — and particularly their leader, David Jones — did in fact have abilities as yet undreamt of. People have a way of surprising you sometimes. And isn’t that a wonderful thing?


 

There are some interesting parallels between the early days of DMA Design and the early days of Imagine Software, that predecessor to Psygnosis. Like Imagine, DMA was born in a city far from the cultural capitals of Britain — even farther away than Liverpool, in fact, all the way up in Dundee, Scotland.

Dundee as well had traditionally been a working-class town that thrived as a seaport, until the increasing size of merchant ships gradually made the role untenable over the course of the twentieth century. Leaders in Dundee, as in Liverpool, worked earnestly to find new foundations for their city’s economy. And one of these vectors of economic possibility — yet again as in Liverpool — was electronics. Dundee convinced the American company National Cash Register Corporation, better known as NCR, to build their principal manufacturing plant for Britain and much of Europe there as early as 1945, and by the 1960s the city was known throughout Britain as a hub of electronics manufacture. Among the electronics companies that came to Dundee was Timex, the Norwegian/Dutch/American watchmaking conglomerate.

In 1980, Sinclair Research subcontracted out most of the manufacture of their new ZX80 home computer to the Timex plant in Dundee. The relationship continued with the ZX81, and then with Sinclair’s real heavy hitter, the Spectrum. By 1983, Timex was straining to keep up with demand for the little machines, hiring like mad in and around Dundee in order to keep the production lines rolling day and night.

David Jones. (It’s my understanding that the nose is removable.)

One of the people they hired was David Jones, 18 years old and fresh out of an experimental new “computer studies” program that was being pioneered in Dundee. He came to Timex on an apprenticeship, which paid for him to take more computer courses at nearby Kingsway Technical College.

Just as Bruce Everiss’s Microdigital shop had in Liverpool, Kingsway College was fomenting a hacking scene in Dundee, one made up largely of working-class youths who, but for this golden chance, might have had to resign themselves to lives spent sweeping out warehouses. Both students of the college and interested non-students would meet regularly in the common areas to talk shop and, inevitably, to trade pirated games. Jones became one of the informal leaders of the collective. To ensure a steady supply of games for his mates, he even joined a cracking group in the international piracy “scene” who called themselves the Kent Team.

But for the most dedicated of the Kingsway gang, Jones among them, trading and playing games was a diversion rather than the real point. Like the gang as a whole, this hacker hardcore was of disparate temperaments and ages — one of them, named Mike Dailly, was just 14 years old when he started showing up at the college — but they were united by a certain seriousness about computers, by the fact that computers for them were, rather than just a hobby or even a potential means of making a living, an all-consuming passion. Unlike their more dilettantish peers, they were more interested in understanding how the games they copied worked and learning how to make their own than they were in playing them for their own sake.

In 1986, Sinclair sold their entire extant home-computer line to Amstrad, who promptly took the manufacturing of same in-house, leaving Timex out of a huge contract. Most of the Dundee plant’s employees, Jones among them, were laid off as a result. At the urging of his parents, he invested half of his £2000 severance check into a degree course in Computer Science at the Dundee College of Technology (now known as Abertay University). He used the other half to buy a Commodore Amiga.

Brand-new and pricey, the Amiga was still a very exotic piece of kit anywhere in Britain in 1986, much less in far-flung Dundee; Jones may very well have been the first person in his hometown to own one. His new computer made him more popular than ever among the Kingsway hackers. With the help of his best buddies from there and some others he’d met through the Kent Team, his experiments with his new toy gradually coalesced around making a shoot-em-up game that he could hopefully sell to a publisher.

David Jones first met Ian Hetherington and Dave Lawson of Psygnosis in late 1987, when he took his shoot-em-up-in-progress to the Personal Computer World Show to hawk it to potential publishers. Still at that age when the events of last year, much less those of three years ago, seem like ancient history, he had little awareness of their company’s checkered past as Imagine, and less concern about it. “You know, I don’t think I even researched it that well,” he says. “I remember the stories about it, but back in those days everything was moving so quickly, it never even crossed my mind.” Instead he was wowed, like so many developers, by Psygnosis’s cool good looks. The idea of seeing his game gussied up by the likes of Roger Dean was a difficult one to resist. He also liked the fact that Psygnosis was, relatively speaking, close to Dundee: only about half a day by car.

Menace

Psygnosis was perhaps slightly less impressed. They agreed to publish the game, running it through their legendary art department in the process, giving it the requisite Roger Dean cover, and changing the name from the Psygnosis-like Draconia to the still more Psygnosis-like Menace. But they didn’t quite judge it to be up to the standard of their flagship games, publishing it in a smaller box under a new budget line they were starting called Psyclapse (a name shared with the proposed but never-completed second megagame from the Imagine days).

Still, even at a budget price and thus a budget royalty rate, Menace did well enough to make Jones a very happy young man, selling about 20,000 copies in the still emerging European Amiga market. It even got a showcase spot in the United States, when the popular television program Computer Chronicles devoted a rare episode to the Amiga and Commodore’s representative chose Menace as the game to show off alongside Interplay’s hit Battle Chess; host Stewart Cheifet was wowed by the “really hot graphics.” A huge auto buff — another trait he shared with the benighted original founders of Imagine — Jones made enough from the game to buy his first new car. It was a Vauxhall Astra hot hatch rather than a Ferrari, but, hey, you had to start somewhere.

Even before Menace‘s release, he had taken to calling his informal game-development club DMA Design, a name that fit right in with Psygnosis’s effect-obsessed aesthetic: “DMA” in the computer world is short for “Direct Memory Access,” something the Amiga’s custom chips all utilized to generate audiovisual effects without burdening the CPU. (When the glory days of Psygnosis with Amiga tech-heads had passed, Jones would take to saying that the name stood for “Doesn’t Mean Anything.”) In the wake of Menace‘s success, he decided to drop out of college and hang up his shingle in a cramped two-room space owned by his fiancée’s father, located on a quiet street above a baby shop and across the way from a fish-and-chips joint. He moved in in August of 1989, bringing with him many of his old Kingsway College mates on a full-time or part-time basis, as their circumstances and his little company’s income stream dictated.

DMA’s first office, a nondescript place perched above a baby shop in Dundee. That’s artist Gary Timmons peering out the window. It was in this improbable location that the most successful game the British games industry had produced to date was born.

DMA, which still had more the atmosphere of a computer clubhouse than a conventional company, gladly took on whatever projects Psygnosis threw them, including the rather thankless task of converting existing Amiga titles — among them Shadow of the Beast — to more limited, non-Amiga platforms like the Commodore 64. Meanwhile Jones shepherded to completion their second original game, another typically Psygnosisian confection called Blood Money. This game came out as a full-price title, a sign that they were working their way up through the ranks. Even better, it sold twice as many copies as had Menace.

So, they charged ahead on yet another game cut from the same cloth. Just about everything you need to know about their plans for Gore is contained in the name. Should you insist on more information, consider this sample from a magazine preview of the work-in-progress: “Slicing an adversary’s neck in two doesn’t necessarily guarantee its defeat. There’s a chance the decapitated head will sprout wings and fly right back at you!” But then, in the midst of work on that charming creation, fate intervened to change everything forever for DMA and Psygnosis alike.

The process that would lead to the most popular computer game the nation of Britain had yet produced had actually begun months before, in fact within days of the DMA crew moving into their new clubhouse. It all began with an argument.

Blood Money was in the final stages of development at the time — it would be released before the end of 1989 — and, having not yet settled on making Gore, Jones and company had been toying with ideas for a potential sequel they tentatively called Walker, based around one of the characters in Blood Money, a war robot obviously inspired by the Imperial walkers in Star Wars: The Empire Strikes Back. Scott Johnson, an artist whom DMA had recently rescued from a life of servitude behind the counter of the local McDonald’s, said that the little men which the walker in the new game would shoot with its laser gun and crush beneath its feet would need to be at least 16 by 16 pixels in size to look decent. Mike Dailly, the Kingsway club veteran, disagreed, and set about trying to prove himself right by drawing them inside a box of just 8 by 8 pixels.

But once he got started drawing little men in Deluxe Paint, he just couldn’t stop. (Deluxe Paint did always tend to have that effect on people; not for nothing did artist George Christensen once call it “the greatest videogame ever designed.”) Just for fun, he added a ten-ton weight out of a Coyote-and-Road-Runner cartoon, which crushed the little fellows as they walked beneath it. Johnson soon jumped back into the fray as well, and the two wound up creating a screen-full of animations, mostly showing the little characters coming to various unhappy ends. Almost forgotten by the end of the day was the fact that Dailly had proved his point: a size of 8 by 8 pixels was enough.

This screen of Deluxe Paint animations, thrown together on a lark one afternoon by a couple of bored young men, would spawn a franchise which sold 15 million games.

It was another member of the DMA club, Russell Kay, who looked at the animations and spoke the fateful words: “There’s a game in that!” He took to calling the little men lemmings. It was something about the way they always seemed to be moving in lockstep and in great quantities across the screen whenever they showed up — and the way they were always blundering into whatever forms of imaginative deaths their illustrators could conjure up. The new name resulted in the appearance of the little men morphing into something not quite rodent-like but not quite human either, a bunch of vaguely Smurf-like fellows with bulbous noses and a cuteness people all over the world would soon find irresistible — even as they watched them die horribly by the dozens.

The very first lemmings

 


Did you know?

James Bond author Ian Fleming’s real name was in fact Ian Frank Lemming. It doesn’t take a genius to see how easily a mistake was made.

Success for Jane Lemming was always on the cards, but it took a change of name and hairstyle to become a top television personality — better known as Jan Leeming.

“One day you’ll be a big movie star,” someone once told leading Hollywood heartthrob Jack Lemmon (real name Jack Lemming). And he is.

(courtesy of the January 1991 issue of The One)

Real-world lemmings are in fact very different from the myth of little rodent soldiers following one another blindly off the side of a cliff. The myth, which may first have shown up in print in Cyril Kornbluth’s 1951 science-fiction story “The Marching Morons,” was popularized by Disney later in the decade, first in their cartoons and later in a purported nature documentary called White Wilderness, whose makers herded the poor creatures off a cliff while the cameras rolled.

While it has little to no basis in reality, the myth can be a difficult proposition to resist in terms of metaphor. Fans of Infocom’s interactive fiction may remember lemmings showing up in Trinity, where the parallel between lemmings marching off a cliff and Cold War nuclear brinkmanship becomes a part of author Brian Moriarty’s rich symbolic language. It’s safe to say, though, that no similarly rarefied thoughts about the creatures were swirling around DMA Design. They just thought they were really, really funny.


 

Despite Russell Kay’s prophetic comment, no one at DMA Design was initially in a hurry to do anything more with their lemmings than amuse themselves. For some months, the lemmings were merely a joke that was passed around DMA’s circle in Dundee in the form of pictures and animations, showing them doing various ridiculous things and getting killed in various hilarious ways.

In time, though, Jones found himself at something of an impasse with the Gore project. He had decided he simply couldn’t show all the gore he wanted to on a 512 K Amiga, but Psygnosis was extremely reluctant, despite the example of other recent releases like Dungeon Master, to let him make a game that required 1 MB of memory. He decided to shelve Gore for a while, to let the market catch up to his ambitions for it. In the meantime, he turned, almost reluctantly, to those lemmings that were still crawling all over the office, to see if there was indeed a game there. The answer, of course, would prove to be a resounding yes. There was one hell of a game there, one great enough to ensure that Gore would never cross anyone’s mind again.

The game which Jones started to program in June of 1990 was in one sense a natural evolution of the animations, so often featuring long lines of marching lemmings, that his colleagues had been creating in recent months. In another, though, it was a dramatic break from anything DMA — or, for that matter, Psygnosis — had done before, evolving into a constructive puzzle game rather than another destructive action game.

That said, reflexes and timing and even a certain amount of destruction certainly have their roles to play as well. Lemmings is a level-based game. The little fellows — up to 100 of them in all — pour out of a chute and march mindlessly across the level’s terrain from left to right. They’ll happily walk off cliffs or into vats of water or acid, or straight into whatever other traps the level contains. They’ll turn around and march in the other direction only if they encounter a wall or other solid barrier — and, once they start going from right to left instead of left to right, they won’t stop until they’re dead or they’re forced to turn around yet again. Your task is to alter their behavior just enough to get as many of them as possible safely to the level’s exit. Doing so often requires sacrificing some of them for the greater good.
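
If you’ll permit a brief aside in code: the rules just described amount to a tiny state machine which every lemming runs on every tick of the simulation. Here is a minimal sketch of it in Python, purely my own illustration rather than anything resembling DMA’s actual implementation (which, in the fashion of Amiga games of the era, would have been hand-tuned 68000 assembly); the names and the fatal fall distance are invented for the example.

```python
# A purely illustrative sketch of the marching rules described above.
# All names and numbers here are my own inventions, not DMA Design's.

FALL_LIMIT = 60  # assumed fatal fall distance, in pixels, for a non-floater

class Lemming:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.direction = 1        # +1 = marching right, -1 = marching left
        self.fall_distance = 0
        self.alive = True
        self.skills = set()       # e.g. {"climber", "floater"}

    def update(self, terrain):
        """One tick of the walk/fall/turn automaton. `terrain` must answer
        terrain.solid(x, y) -> bool for any single pixel coordinate."""
        if not terrain.solid(self.x, self.y + 1):
            self.y += 1                        # nothing underfoot: fall
            self.fall_distance += 1
        else:
            if self.fall_distance > FALL_LIMIT and "floater" not in self.skills:
                self.alive = False             # landed after falling too far
            elif terrain.solid(self.x + self.direction, self.y):
                self.direction = -self.direction   # barrier ahead: turn around
            else:
                self.x += self.direction       # march blindly onward
            self.fall_distance = 0
```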

To meet your goal, you have a limited but surprisingly flexible palette of possibilities at your disposal. You can change the behavior of individual lemmings by assigning them one or both of two special abilities, and/or by telling them to perform one of six special actions. The special abilities include making a lemming a “climber,” able to crawl up sheer vertical surfaces like a sort of inchworm; and making a lemming a “floater,” equipped with an umbrella-cum-parachute which will let him fall any distance without harm. The special actions include telling a lemming to blow himself up (!), possibly damaging the terrain around him in the process; turning him into a “blocker,” standing in one place and forcing any other lemmings who bump into him to turn around and march in the other direction; having him build a section of bridge — or, perhaps better said, of an upward-angling ramp; or having him dig in any of three directions: horizontally, diagonally, or vertically. Each level lets you use each special ability and each special action only a limited number of times. These restrictions are key to much of the challenge of the later levels; advanced Lemmings players become all too familiar with the frustration of winding up short by that one bridge-builder or digger, and having to start over with a completely new approach because of it.
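
That limited allotment amounts to a small layer of resource bookkeeping sitting atop the simulation. Continuing the sketch above, here is one hypothetical way such quotas might be tracked, with all names and quantities again my own invention:

```python
# Hypothetical bookkeeping for a level's skill allotment, mirroring the
# toolbar counts described above. Names and numbers are invented.

class SkillPanel:
    def __init__(self, quotas):
        # e.g. {"climber": 2, "floater": 2, "bomber": 5, "blocker": 3,
        #       "builder": 10, "basher": 2, "miner": 2, "digger": 2}
        self.quotas = dict(quotas)

    def assign(self, skill, lemming):
        """Spend one use of `skill` on `lemming`, if any uses remain."""
        if self.quotas.get(skill, 0) == 0:
            return False              # out of that skill entirely
        self.quotas[skill] -= 1
        lemming.skills.add(skill)     # the lemming from the sketch above
        return True
```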

In addition to the tools at your disposal which apply to individual lemmings, you also have a couple of more universal tools to hand. You can control the rate at which lemmings pour out of the entrance chute — although you can’t slow them down below a level’s starting speed — and you can pause the game to take a breather and position your cursor just right. The later levels require you to take advantage of both of these abilities, not least because each level has a time limit.


A Four-Screenshot Introduction to Lemmings

We’re about to begin one of the early levels — the first on the second difficulty level, or 31st out of 120 in all. We see the level’s name, the number of lemmings with which we have to deal, the number we’re required to save, their “release rate” — how quickly they fall out of the chute and into the world — and how much time we have.

Now we’ve started the level proper. We need to build a bridge to cross this gap. To keep the other lemmings from rushing after our slow-working bridge-builder and falling into the abyss, we’ve turned the one just behind him into a blocker. When we’re ready to let the lemmings all march onward, we can tell the blocker to blow himself up, thus clearing the way again. Note our toolbar at the bottom of the screen, including the count of how many of each type of action we have left.

With the gap safely bridged, we turn our lead lemming into a horizontal digger — or “basher” in the game’s preferred nomenclature — to get through the outcropping.

There were no other barriers between our lead lemming and the exit. So, we’ve blown up our blocker to release the hounds — er, lemmings — and now watch them stream toward the exit. We’ve only lost one lemming in total, that one being our poor blocker — a 99-percent survival rate on a level that only required us to save 50 percent. But don’t get too cocky; the levels will soon start getting much, much harder.


 

Setting aside design considerations for the moment, Lemmings is nothing short of an amazing feat in purely technical terms. Its levels, which usually sprawl over the width of several screens, consist entirely of deformable terrain. In other words, you can, assuming you have the actions at your disposal, dig wherever you want, build bridges wherever you want, etc., and the lemmings will interact with the changed terrain just as you would expect. To have the computer not just paint a landscape onto the screen but to be aware of and reactive to the potentially changing contents of every single pixel was remarkable in the game’s day. And then to move up to 100 independent lemmings in real time, while also being responsive to the player’s inputs… Lemmings is a program any hacker would be thrilled to claim.
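
The data structure implied by all this is a pixel-level solidity mask covering the entire level: every lemming consults it constantly (as in the walking sketch earlier), and bashing, digging, and building simply rewrite it in place. Here is a minimal illustration of the idea; the real game, it should be said, worked on the Amiga’s actual bitplane graphics rather than anything like this:

```python
# Deformable terrain as a pixel-solidity mask. Illustrative only; the real
# game operated directly on the Amiga's bitplane graphics.

class Terrain:
    def __init__(self, width, height):
        self.width, self.height = width, height
        # True = solid ground, False = empty air. A real level would
        # initialize this mask from the level's artwork.
        self.mask = [[False] * width for _ in range(height)]

    def solid(self, x, y):
        """Is this single pixel solid? (Consulted constantly by every lemming.)"""
        if 0 <= x < self.width and 0 <= y < self.height:
            return self.mask[y][x]
        return False   # beyond the level's edges counts as empty air

    def carve(self, x, y, w, h):
        """Clear a rectangle of pixels: the essence of bashing and digging."""
        for yy in range(max(0, y), min(self.height, y + h)):
            for xx in range(max(0, x), min(self.width, x + w)):
                self.mask[yy][xx] = False

    def fill(self, x, y, w, h):
        """Set a rectangle of pixels: a builder's bridge, one brick at a time."""
        for yy in range(max(0, y), min(self.height, y + h)):
            for xx in range(max(0, x), min(self.width, x + w)):
                self.mask[yy][xx] = True
```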

As soon as he had the basic engine up and running, David Jones visited Psygnosis with it and “four or eight” levels in tow, to show them what he was now planning to turn into DMA’s next game. Jones:

They were a big company, probably about thirty or forty people. I said, “I’ll just go out to lunch, but what I’ll do is I’ll just leave the demo with a bunch of you here — grab it, play it, see what you think.” I remember coming back from lunch and it was on every single machine in the office. And everybody was just really, really enjoying it. At that time I thought, “Well, we have something really special here.”

That reaction would become typical. In an earlier article, I wrote about a Tetris Effect, meant to describe the way that game took over lives and destroyed productivity wherever it went. We might just as well coin the term “Lemmings Effect” now to describe a similar phenomenon. Part and parcel of the Lemmings Effect were all the terrible jokes: “What do lemmings drink?” “Lemmingade!” “What’s their favorite dessert?” “Lemming meringue pie!”

Following its release, Lemmings would be widely greeted as an immaculate creation, a stroke of genius with no antecedents. Our old friend Bruce Everiss, not someone inclined to praise anything Ian Hetherington was involved in without ample justification, nevertheless expressed the sentiment in his inimitably hyperbolic way in 1995:

In the worlds of novels and cinema, it is recognised that there are only a small number of plots in the universe. Each new book or film takes one of these plots and interprets it in a different way.

So it is in computer games. Every new title has been seen in many different guises; it is merely the execution that is new. The Amiga unleashed new levels of sound, graphics, and computer power on the home market. Software titles utilised these capabilities in some amazing packages, but they were all just re-formulations of what had gone before.

Until Lemmings. DMA Design created a totally new concept. In computer games this is less common than rocking-horse manure. Not only was the concept of Lemmings completely new, but also the execution was exemplary, displaying the Amiga’s capabilities well.

Lemmings is indeed a shockingly original game, made all the more shocking by coming from a developer and a publisher that had heretofore given one so little reason to anticipate originality. Still, if we join some of the lemmings in digging a bit beneath the surface, we can in fact see a source for some of the ideas that went into it.

David Jones and his colleagues were fanatic devotees of Peter Molyneux’s Populous before and during their work on Lemmings. Populous, like Lemmings, demands that you control a diffuse mass of actors through somewhat indirect means, by manipulating the environment and changing the behavior of certain individuals in the mass. Indeed, Lemmings has a surprising amount in common with Populous, even as the former is a puzzle game of rodent rescue and the latter a strategy game of medieval warfare. Jones and company went so far as to add a two-player mode to Lemmings in homage to the Populous tournaments that filled many an evening spent in the clubhouse above the baby shop. (In the case of Lemmings, however, the two-player mode wouldn’t prove terribly popular, not least because it required a single Amiga equipped with two mice; unique and entertaining, it’s also largely forgotten today, having been left out of the sequels and most of the ports.)

Although it’s seldom if ever described using the name, Lemmings thus fits into a group of so-called “god games” that were coming to the fore at the turn of the decade; in addition to Populous, the other famous exemplar from the time is Will Wright’s SimCity. More broadly, it also fits into a longstanding British tradition of spatial puzzle games, as exemplified by titles like The Sentinel.

The Lemmings level editor

But one area where David Jones and company wisely departed from the model of Populous and The Sentinel was in building all of the levels in Lemmings by hand. British programmers had always had a huge fondness for procedural generation, which suited both the more limited hardware they had to work with in comparison to their American peers and the smaller teams in which they generally worked. Jones bucked that precedent by building a level editor for Lemmings as soon as he had the game engine itself working reasonably well. The gang in and around the clubhouse all spent time with the level editor, coming up with devious creations. Once a week or so, they would vote on the best of them, then upload these to Psygnosis.

Belying their reputation for favoring style over substance, Psygnosis had fallen in love with the game of Lemmings from the day of Jones’s first visit with his early demo in tow. It had won over virtually everyone who worked there, gamer and non-gamer alike. That fact became a key advantage for the work-in-progress. Everyone in Liverpool would pile on to play the latest levels as they were sent over from Dundee, faxing back feedback on which ones should make the cut and how those that did could be made even more interesting. As the game accelerated toward completion, Jones started offering a £10 bounty for every level that passed muster with Psygnosis, leading to yet more frenzied activity in both Dundee and Liverpool. Almost accidentally, DMA and Psygnosis had hit upon a way of ensuring that Lemmings would get many times the play-testing of the typical game of its era, all born of the fact that everyone was dying to play it — and dying to talk about playing it, dying to explain how the levels could be made even better. The results would show in the finished product. Without the 120 lovingly handcrafted levels that shipped with the finished game, Lemmings would have been an incredible programming feat and perhaps an enjoyable diversion, but it could never have been the sensation it became.

Just as the quality of the levels was immeasurably improved by the huge willing testing pool of Psygnosis staffers, so was their diversity increased by having so many different personalities making them. Even today, those who were involved in making the game can immediately recognize a level’s author from its design and even from its graphical look. Gary Timmons, a DMA artist, became famed, oddly enough given his day job, for his minimalist levels that featured little but the bare essentials needed to fulfill their functions. Mike Dailly, at the other extreme, loved to make his levels look “pretty,” filling them with colors and patterns that had nothing to do with the gameplay. The quintessential examples of Dailly’s aesthetic were a few special levels which were filled with graphics from the earlier Psygnosis games Shadow of the Beast I and II, Awesome, and DMA’s own Menace.

Awesome meets Lemmings. For some reason, I find the image of all these little primary-colored cartoon lemmings blundering through these menacing teenage-cool landscapes to be about the funniest — and certainly the most subversive — thing in the entire game.

But of course the most important thing is how the levels play — and here Lemmings only rarely disappoints. There are levels which play like action games, demanding perfect clicking and good reflexes above all; there are levels which play like the most cerebral of strategy games, demanding perfect planning followed by methodical execution of the plan. There are levels where you need to shepherd a bare handful of lemmings through an obstacle course of tricks and traps without losing a single one; there are levels where you have 100 lemmings, and will need to kill 90 percent of them in order to get a few traumatized survivors to the exit. There are levels which are brutally compressed, where you have only one minute to succeed or fail; there are levels which give you fully nine minutes to guide your charges on a long journey across several screens worth of terrain.

One of the most remarkable aspects of Lemmings is the way it takes the time to teach you how to play it. The very first level is called “Just Dig!,” and, indeed, requires nothing more of you. As you continue through the first dozen levels or so, the game gradually introduces you to each of the verbs at your command. Throughout the levels that follow, necessity — that ultimate motivator — will force you to build upon what you already know, learning new tricks and new combinations. But it all begins gently, and the progression from rank beginner to master lemming-herder feels organic. Although the general trajectory of the difficulty is ever upward as you work your way through the levels, there are peaks and valleys along the way, such that a level that you have to struggle with for an hour or two will usually be followed by one or two less daunting challenges.

All of this has since become widely accepted as good design practice, but games in 1990 were very seldom designed like this. Looking for contemporaneous points of comparison, the best I can come up with is a game in a very different genre: the pioneering real-time dungeon-crawler Dungeon Master, which also gently teaches you how to play it interactively, without ever resorting to words, and then slowly ramps up the difficulty until it becomes very difficult indeed. Dungeon Master and Lemmings stand out in gaming history for not only inventing whole new paradigms of play, but for perfecting them in the same fell swoop. Just as it’s difficult to find a real-time dungeon crawler that’s better than Dungeon Master, you won’t find a creature-herding puzzle game that’s better than the original Lemmings without looking long and hard.

If I were to criticize anything in Lemmings, I’d have to point to the last handful of levels. For all its progressive design sensibilities, Lemmings was created in an era when games were expected to be hard, when the consensus view had it that actually beating one ought to be a monumental achievement. In that spirit, DMA pushed the engine — not to mention the player — to the ragged edge and perhaps a little beyond in the final levels. A blogger named Nadia, who took upon herself the daunting task of completing every single level in every single Lemmings game and writing about them all, described the things you need to do to beat many of these final levels as “exploiting weird junk in the game engine.” These levels are “the wrong kind of difficult,” she goes on to say, and I agree. Ah, well… at least the very last level is a solid one that manages to encompass much of what has come before, sending the game out on a fine note.

Here we see one of the problems that dog the final levels. There are 49 lemmings packed together in a tiny space. I have exactly one horizontal dig at my disposal, and need to apply it to a lemming pointing to the right so the group can make its exit, but there’s no possible way to separate one lemming from another in this jumble. So, I’m down to blind luck. If luck isn’t with me — if the lemming I wind up clicking on is pointed in the wrong direction — I have to start the level over through no fault of my own. It will then take considerable time and effort to arrive back at this point and try again. This sort of situation is sometimes called “fake difficulty” — difficulty that arises from the technical limitations of the interface or the game engine rather than purer design considerations. It is, needless to say, not ideal.

To modern ears, taking this ludic masterpiece from nothing to finished in less than a year sounds like an incredible feat. Yet that was actually a fairly long development cycle by the standards of the British games industry of 1990. Certainly Psygnosis’s marketers weren’t entirely happy about its length. Knowing they had something special on their hands, they would have preferred to release it in time for Christmas. Thankfully, better sense prevailed, keeping the game off the market until it was completely ready.

As Lemmings‘s February 1991 release date approached, Psygnosis’s marketers therefore had to content themselves with beating the hype drum for all it was worth. They sent early versions to the magazines, to enlist them in building up the buzz about the game. One by one, the magazines too fell under the thrall of the Lemmings Effect. Amiga Format would later admit that they had greeted the arrival of the first jiffy bag from Psygnosis with little excitement. “Some snubbed it at first,” they wrote, “saying that they didn’t like puzzlers, but in the end the sound of one ‘Oh, no!’ would turn even the most hardened cynic into an addict.”

When a journalist from the American magazine .info visited Liverpool, he went to a meeting where Psygnosis showed the game to their distributors for the first time. The reaction of these jaded veterans of the industry was as telling as had been the Lemmings Effect that had swept through all those disparate magazine offices. “They had to be physically torn away from the computers,” wrote .info, “and crowds of kibitzers gathered to tell the person playing how to do it.” When an eight-level demo version of the game went out on magazine cover disks a few weeks before the launch, the response from the public at large was once again, in David Jones’s words, “tremendous,” prompting the usually cautious Psygnosis — the lessons of the Imagine days still counted with Ian Hetherington — to commit to an initial pressing far larger than that of any of their previous games.

And yet it wasn’t anywhere near large enough. When Lemmings was released on Valentine’s Day, 1991, its first-day sales were unprecedented for Psygnosis, who were, for all their carefully cultivated cool, still a small publisher in a big industry. Jones remembers Ian Hetherington phoning him up hourly to report the latest numbers from the distributors: 40,000 sold, 50,000 sold. On its very first day, the game sold out all 60,000 copies Psygnosis had pressed. To put this number in perspective, consider that DMA’s Menace had sold 20,000 copies in all, Blood Money had sold 40,000 copies, and a game which sold 60,000 copies over its lifetime on the Amiga was a huge success by Psygnosis’s usual standards. Lemmings was a success on another level entirely, transforming, literally overnight, the lives of everyone who had been involved in making it happen. Psygnosis would struggle for months to turn out enough copies to meet the insatiable demand.

Unleashed at last to write about the game which had taken over their offices, the gaming press fell over themselves to praise it; it may have been only February, but there was no doubt what title was destined to be game of the year. The magazine The One felt they needed five pages just to properly cover all its nuances — or, rather, to gush all over them. (“There’s only one problem with Lemmings: it’s too addictive by half. Don’t play it if you have better things to do. You won’t ever get round to doing them.”) ACE openly expressed the shock many were feeling: shock that this hugely playable game could have come out of Psygnosis. It felt as if all of the thinking about design that they could never be bothered to do in the past had now been packed into this one release.

And as went the British Amiga scene, so went Europe and then the world. The game reached American shores within weeks, and was embraced by the much smaller Amiga scene there with the same enthusiasm their European peers had evinced. North American Amiga owners had seen their favored platform, so recently the premier gaming computer on their continent, as it still was in Europe, falling out of favor over the course of the previous year, with cutting-edge releases like Wing Commander appearing first on MS-DOS and only later making their way — and in less impressive versions at that — to the Amiga. Lemmings would go down in history as a somewhat melancholy milestone: as one of the last Amiga games to make American owners of other computers envious.

But then, Psygnosis had no intention of keeping a hit like this one as an Amiga exclusive for very long. Within months, an MS-DOS version was available. Amiga owners didn’t hesitate to point out its failings in comparison to the original — the controls weren’t quite right, they insisted, and the unique two-player mode had been cut out entirely — but the game’s charms were still more than intact enough. It was in MS-DOS form that Lemmings really conquered North America, thus belatedly fulfilling for Ian Hetherington, last man standing at Psygnosis from the Imagine days, the old Imagine dream of becoming a major player on the worldwide software stage. In 1992, the magnitude of their newfound success in North America led Psygnosis to open their first branch office in Boston. The people who worked there spent most of their time answering a hint line set up for the hundreds of thousands — soon, millions, especially after the game made its way to Nintendo consoles — of frazzled Americans who were hopelessly stymied by this or that level.

Britain was the country that came the closest to realizing the oft-repeated ambition of having its game developers treated like rock stars. DMA Design was well-nigh worshiped by Amiga owners in the wake of Lemmings. Here they appear in a poster inserted into a games magazine, ready to be pasted onto a teenage girl’s wall alongside her favorite boy bands. Well, perhaps a really nerdy teenage girl’s wall. Hey, it could happen…

There was talk for some time of polishing up David Jones’s in-house level editor and turning it into a Lemmings Construction Kit, but Psygnosis soon decided that they would rather sell their customers more content in the form of add-on disks than a way of making their own levels. Addicts looking for their next fix could thus get their hands on Oh No! More Lemmings before the end of 1991, with 100 more levels on offer. This collection had largely been assembled from the cast-offs that hadn’t quite made the cut for the first game, and the reasons why weren’t that hard to discern: these levels were hard, and a little too often in that same cheap way as some of the final levels from the original. Still, it served the purpose, delivering another huge hit. That Christmas, Psygnosis gave away Xmas Lemmings, a free demo disk with a few levels re-skinned for the holiday season. They hit upon a magical synergy in doing so; the goofy little creatures, now dressed in Santa suits, went together perfectly with Christmas, and similar disks would become a tradition for several more holiday seasons to come.

This insertion of Lemmings into the cozy family atmosphere of Christmas is emblematic of the game’s appeal to so many outside the usual hardcore-gamer demographic. Indeed, Lemmings joined Tetris during this period in presaging the post-millennial casual-game market. Perhaps even more so than Tetris, it has most of the important casual traits: it’s eminently approachable, easy to learn, bright and friendly in appearance, and thoroughly cute. Granted, it’s cute in an odd way that doesn’t reward close thinking overmuch: the lemmings are after all exploding when they pipe up with their trademark “Oh, no!,” and playing the game for any length of time entails sending thousands upon thousands of the little buggers to their deaths. And yet cute it somehow manages to be.

One important quality Lemmings shares with Tetris is the way it rewards whatever level of engagement you care to give it. The early levels can be blundered through by anyone looking for a few minutes’ diversion — stories abounded of four-year-olds managing the training levels — but the later ones will challenge absolutely anyone who tackles them. Likewise, the level-based structure means Lemmings can be used to fill a coffee break or a long weekend, as you wish. This willingness to meet players on their own terms is another of the traits of gaming’s so-called “casual revolution” to come. It’s up to you to get off the train wherever interest and dedication dictate.

But those aspects of the Lemmings story — and with them the game’s full historical importance — would be clearly seen only years in the future. In the here and now, DMA had more practical concerns. David Jones abandoned the clubhouse above the baby shop for much larger digs, hired more staff, and went back to the grindstone to deliver a full-fledged sequel. He didn’t, however, neglect to upgrade his lifestyle to match his new circumstances, including the requisite exotic sports car.

Unlike the old guard of Imagine Software, Jones could actually afford his excesses. When all the sales of all the sequels that would eventually be released are combined with those of the original, the total adds up to some 15 million games sold, making Lemmings by far the biggest gaming property ever to be born on the Amiga and the biggest to be born in Britain prior to Grand Theft Auto — a series which, because the success of Lemmings apparently hadn’t been enough for them, the nucleus of DMA Design would later be responsible for as well.

Meanwhile Psygnosis, now The House That Lemmings Built, also found larger offices, in Liverpool’s Brunswick Business Park, and began to cautiously indulge in a bit more excess of their own. Ian Hetherington was the only Imagine veteran still standing, but never mind: games on the Mersey had finally come of age thanks to a game from — of all places! — Dundee.

(Sources: the books Grand Thieves and Tomb Raiders: How British Videogames Conquered the World by Rebecca Levene and Magnus Anderson and Sinclair and the “Sunrise” Technology: The Deconstruction of a Myth by Ian Adamson and Richard Kennedy; the 1989 episode of Computer Chronicles entitled “The Commodore Amiga”; The One of June 1989, March 1990, September 1990, and January 1991; .info of November 1990 and February 1991; Amiga Format of May 1993 and July 1995, and the annual for 1992; The Games Machine of June 1989; ACE of April 1991; Amiga World of June 1991; Amazing Computing of April 1991 and March 1992; the online articles “From Lemmings to Wipeout: How Ian Hetherington Incubated Gaming Success” from Polygon, “The Psygnosis Story: John White, Director of Software” from Edge Online, and “An Ode to the Owl: The Inside Story of Psygnosis” from Push Square; “Playing Catch Up: GTA/Lemmings’ Dave Jones” from Game Developer; Mike Dailly’s “Complete History of DMA Design.” My thanks also go to Jason Scott for sharing his memories of working at Psygnosis’s American branch office in the early 1990s.

If you haven’t played Lemmings before, you’ve been missing out; this is one game everyone should experience. Feel free to download the original Amiga version from here. Amiga Forever is an excellent, easy-to-use emulation package for running it. For those less concerned about historical purity, there are a number of versions available to play right in your browser.)

 
 


Games on the Mersey, Part 4: The All-Importance of Graphics

The die for the first successful incarnation of Psygnosis was cast in the summer of 1987 with the release of a game called Barbarian. It was actually the company’s fourth game, following Brataccas, that underwhelming fruition of Imagine Software’s old megagame dream, and two other titles which had tried to evoke some of the magic of games from other publishers and rather resoundingly failed: Arena, an unfun alternative to the Epyx Games sports series, and Deep Space, a vaguely Elite-like game of interstellar trading and space combat saddled with a control scheme so terrible that many buyers initially thought it was a bug. None of this trio, needless to say, had done much for Psygnosis’s reputation. But with Barbarian the company’s fortunes finally began to change. It provided them at last with just the formula for commercial success they had been so desperately seeking.

Barbarian

Programmed by the redoubtable Dave Lawson, Barbarian might be labeled an action-adventure if we’re feeling generous, although it offers nothing like the sort of open-ended living world other British developers of the era were creating under that label. It rather takes the form of a linear progression through a series of discrete screens, fighting monsters and dodging traps as the titular barbarian Hegor. The control scheme — for some reason a consistent sore spot in almost every game Lawson programmed — once again annoys more than it ought to, and the game as a whole is certainly no timeless classic. What it did have going for it back in the day, however, were its superb graphics and sound. Released initially only on the Atari ST and the Commodore Amiga, just as the latter especially was about to make major inroads in Britain and Europe thanks to the new Amiga 500 model, it was one of the first games to really show what these 16-bit powerhouses could do in the context of a teenage-boy-friendly action game. Reviewers were so busy gushing about the lengthy opening animation, the “strange-looking animals,” and the “digitised groans and grunts” that accompanied each swing of Hegor’s sword as he butchered them that they barely noticed the game’s more fundamental failings.

Barbarian became the first unadulterated, undeniable hit to be created by the Imagine/Psygnosis nexus since Dave Lawson’s Arcadia had kicked everything off on the Sinclair Spectrum almost five years before. Thus was a precedent set. Out were the old dreams of revolutionizing the substance of gaming via the megagame project; in were simple, often slightly wonky action games that looked absolutely great to the teenage boys who devoured them. If Lawson and Ian Hetherington were disappointed to have abandoned more high-concept fare for simple games with spectacular visuals, they could feel gratified that, after all the years of failure and fiasco as Imagine, Finchspeed, Fireiron, and finally Psygnosis, they were at last consistently making games that made them actual money.

Psygnosis’s first games had been created entirely in-house, with much of the design and coding done by Lawson and Hetherington themselves. In the wake of Barbarian‘s success, however, that approach was changed to prioritize what was really important in them. After the games already in the pipeline at the time of Barbarian‘s release were completed, future programming and design — such as the latter was in the world of Psygnosis — would mostly be outsourced to the hotshot young bedroom coders with which Britain was so amply endowed.

Psygnosis hired far more artists than programmers as in-house employees. They built an art team that became the envy of the industry around one Garvan Corbett, a talented illustrator and animator who had come to Psygnosis out of a workfare program in the very early days, before even Brataccas had been released, and who had been responsible for the precedent-setting graphics in Barbarian. Notably, none of Psygnosis’s artists had much prior experience with computers; the company preferred to hire exceptional artists in traditional mediums and teach them what they needed to know to apply their skills to computer games. It gave Psygnosis’s games a look that was, if not quite what one might describe as more mature than the rest of the industry, certainly more striking, more polished. Working with Amigas running the games-industry stalwart Deluxe Paint, Corbett and his colleagues would build on the submissions of the outside teams to bring them in line with Psygnosis’s house style, giving the in-game graphics that final sheen for which the company was so famous whilst also adding the elaborate title screens and opening animations for which they were if anything even more famous. Such a hybrid of in-house and out-of-house development was totally unique in late-1980s game-making, but it suited Psygnosis’s style-over-substance identity perfectly. “At Psygnosis, graphics are all-important,” wrote one journalist as his final takeaway after a visit to the company. Truer words were never written.

With the assistance of an ever-growing number of outside developers, the games poured out of Psygnosis in the years after Barbarian, sporting short, punchy titles that sounded like heavy-metal bands or Arnold Schwarzenegger movies, both of which were profound influences on their young developers: Terrorpods, Obliterator, Menace, Baal, Stryx, Blood Money, Ballistix, Infestation, Anarchy, Nitro, Awesome, Agony. Occasionally Psygnosis would tinker with the formula, as when they released the odd French adventure game Chrono Quest, but mostly it was nothing but relentless action played over a relentlessly thumping soundtrack. An inordinate number of Psygnosis games seemed to feature tentacled aliens that needed to be blown up with various forms of lasers and high explosives. In light of this, even the most loyal Psygnosis fan could be forgiven for finding it a little hard to keep them all straight. Many of the plots and settings of the games would arrive on the scene only after the core gameplay had been completed, when Psygnosis’s stable of artists were set loose to wrap the skeletons submitted by the outside developers in all the surrealistic gore and glory they could muster. Such a development methodology couldn’t help but lend the catalog as a whole a certain generic quality. Yet it did very, very well for the company, as the sheer number of games they were soon churning out — nearly one new game every other month by 1989 — will attest.

Psygnosis’s favored machine during this era was the Amiga, where their aesthetic maximalism could be deployed to best effect. They became known among owners of Amigas and those who wished they were as the platform’s signature European publisher, the place to go for the most impressive Amiga audiovisuals of all. This was the same space occupied by Cinemaware among the North American publishers. It thus makes for an interesting exercise to compare and contrast the two companies’ approaches.

In the context of the broader culture, few would have accused Cinemaware’s Bob Jacob, a passionate fan of vintage B-movies, of having overly sophisticated tastes. Yet, often problematic though they admittedly were in gameplay terms, Cinemaware’s games stand out next to those of Psygnosis for the way they use the audiovisual capabilities of the Amiga in the service of a considered aesthetic, whether they happen to be harking back to the Robin Hood of Errol Flynn in Defender of the Crown or the vintage Three Stooges shorts in the game of the same name. There was a coherent and unique-to-it sense of aesthetic unity behind each one of Cinemaware’s games, as indicated by the oft-mocked title Jacob created for the person tasked with bringing it all together: the “computographer,” who apparently replaced the cinematographer of a movie.

Psygnosis games, in contrast, had an aesthetic that could be summed up in the single word “more”: more explosions, more aliens, more sprites flying around, more colors. This was aesthetic maximalism at its most maximalist, where the impressiveness of the effect itself was its own justification. Psygnosis’s product-development manager John White said that “half the battle is won if the visuals are interesting.” In fact, he was being overly conservative in making that statement; for Psygnosis, making the graphics good was actually far more than half the battle that went into making a game. Ian Hetherington:

We always start with a technical quest — achieving something new with graphics. We have to satisfy ourselves that what we are trying to achieve is possible before we go ahead with a game. My worst moments are when I show innovative techniques to the Japanese, and all they want to know is, what is the plot. They don’t understand our way of going about things.

The Japanese approach, as practiced by designers like the legendary Shigeru Miyamoto, would lead to heaps of Nintendo Entertainment System games that remain as playable today as they were in their heyday. The Psygnosis approach… not so much. In fact, Psygnosis’s games have aged almost uniquely poorly among their peers. While we can still detect and appreciate the “computography” of a Cinemaware interactive movie, Psygnosis games hit us only with a barrage of audiovisual tricks that no longer impress. Their enormous pixels and limited color palettes — yes, even on an audiovisual powerhouse of the era like the Amiga — now make them look quaint rather than awe-inspiring. Their only hope to move us thus becomes their core gameplay — and gameplay wasn’t one of Psygnosis’s strengths. Tellingly, most Psygnosis games don’t sport credited designers at all, merely programmers and artists who cobbled together the gameplay in between implementing the special effects. An understanding of what people saw in all these interchangeable games with the generic teenage-cool titles therefore requires a real effort of imagination from anyone who wasn’t there during the games’ prime.

The success of Psygnosis’s games was inextricably bound up with the platform patriotism that was so huge a part of the computing scene of the 1980s. What with the adolescent tendency to elevate consumer lifestyle choices to the status of religion, the type of computer a kid had in his bedroom was as important to his identity as the bands he liked, the types of sports cars he favored, or the high-school social set he hung out with — or possibly all three combined. Where adults saw just another branded piece of consumer electronics, he saw a big chunk of his self-image. It was deeply, personally important to him to validate his choice by showing off his computer to best effect, preferably by making it do things of which no other computer on the market was capable. For Amiga owners in particular, the games of Psygnosis fulfilled this function better than those of any other publisher. You didn’t so much buy a Psygnosis game to play it as you did to look at it, and to throw it in the face of any of your mates who might dare to question the superiority of your Amiga. A Psygnosis game was graphics porn of the highest order.

But Psygnosis’s graphics-über-alles approach carried with it more dangers than just that of making games whose appeal would be a little incomprehensible to future generations. Nowhere was the platform patriotism that they traded on more endemic than in the so-called “scene” of software piracy, whose members had truly made the computers they chose to use the very center of their existence. And few games were more naturally tempting targets for these pirates than those of Psygnosis. After all, what you really wanted out of a Psygnosis game was just a good look at the graphics. Why pay for that quick look-see when you could copy the disk for free? Indeed, cracked versions of the games were actually more appealing than the originals in a way, for the cracking groups who stripped off the copy protection also got into the habit of adding options for unlimited lives and other “trainers.” By utilizing them, you could see everything a Psygnosis game had to offer in short order, without having to wrestle with the wonky gameplay at all.

It was partially to combat piracy that Psygnosis endeavored to make the external parts of their games’ presentations as spectacular — and as appealing to teenage sensibilities — as the graphics in the games themselves. Most of their games were sold in bloated oblong boxes easily twice the size of the typical British game box — rather ironically so, given that few games had less need than those of Psygnosis for all that space; it wasn’t as if there was a burning need for lengthy manuals or detailed background information to accompany such simple, generic shoot-em-ups. Virtually all of Psygnosis’s covers during this era were painted by Roger Dean, the well-known pop artist with whom Ian Hetherington and Dave Lawson had first established a relationship back in the Imagine days. “If you’re asking so much for a piece of software,” said Hetherington, “you can’t package it like a Brillo pad.”

Dean’s packaging artwork was certainly striking, if, like most things associated with Psygnosis during this period, a little one-note. Not a gamer himself, Dean had little real concept of the games he was assigned, for which he generally drew the cover art well before they had been completed anyway. He usually had little more to go on than the name of the game and whatever said name might happen to suggest to his own subconscious. The results were predictable; the way that Psygnosis’s covers never seemed to have anything to do with the games inside the boxes became a running joke. No matter: they looked great in their own right, just begging to be hung up in a teenager’s bedroom. In that spirit, Psygnosis took to including the cover art in poster or tee-shirt form inside many of those cavernous boxes of theirs. Whatever else one could say about them, they knew their customers.

It’s difficult to exaggerate the role that image played in every aspect of Psygnosis’s business. Some of the people who wound up making games for them, who were almost universally in the same demographic group as those who played their games, were first drawn to this particular publisher by the box art. “I was always a fan of the art style and the packaging,” remembers outside developer Martin Edmondson. “Against a sea of brightly-colored and cheap-looking game boxes the Psygnosis products stood out a mile and had an air of mystery — and quality — about them.” Richard Browne, who became a project manager at Psygnosis in early 1991, noted that “while many of the games they produced were renowned somewhat for style over substance, they just oozed quality and the presentation was just sheer class.”

The quintessential Psygnosis game of the period between 1987 and 1990 is undoubtedly the 1989 release Shadow of the Beast. The company’s most massively hyped title since the days of Brataccas and the Imagine megagames, it differed from them by rewarding its hypers with sales to match. It was the creation of a three-man development studio who called themselves Reflections, who had already created an earlier hit for Psygnosis in the form of Ballistix. Unusually among Psygnosis’s outside developers, Reflections created all of their own graphics, outsourcing only the music and of course the Roger Dean cover art. Martin Edmondson, the leader of the Reflections trio, makes no bones about their priorities when creating Shadow of the Beast:

For me, it was about the technical challenge of getting something working that was far beyond the theoretical limits of the machine. It wasn’t a story originally. It was a technical demonstration of what the machine could do — I suppose sort of a “look how clever we are” type of thing. So, yeah… less interested in the subtleties of game design and game process, more “let’s see if we can do what these big expensive arcade machines are doing on our small home computer.”

Edmondson describes the book that most inspired Shadow of the Beast not as any work of fiction but rather as the Amiga Technical Reference Manual. Every element of the game was crafted with the intention of pushing the hardware described therein to heights that had only been hinted at even in Psygnosis’s own earlier works. Barely two years after Shadow of the Beast‘s release, Edmondson’s assessment of its design — or perhaps lack thereof — was unflinching: “We couldn’t get away with it now. It was definitely a case of being in the right place at the right time. Apart from how many colors and layers of parallax and monsters we could squeeze on screen, no thought went into it whatsoever.” Its general approach is similar to that of the earlier Barbarian, but it’s a much more constrained experience than even that game had been. As the landscape scrolls behind your running avatar, you have to execute a series of rote moves with pinpoint timing to avoid seeing him killed. It’s brutally difficult, and difficult in ways that really aren’t much fun at all.

But, as Edmondson himself stated, to complain overmuch about the gameplay of Shadow of the Beast is to rather miss the point of the game. While plenty of praise would be given to the atmospheric soundtrack Psygnosis commissioned from veteran game composer David Whittaker and to the unprecedentedly huge sprites programmed by Reflections, the most attention of all would be paid to the thirteen layers of parallax scrolling that accompany much of the action.

Parallax scrolling is a fancy phrase used to describe the simulation of a real-world property so ingrained in us that we rarely even realize it exists — until, that is, we see a videogame that doesn’t implement it. Imagine you’re standing at the edge of a busy highway on a flat plain, with a second highway also in view beyond this one, perhaps half a kilometer in the distance. Cars on the highway immediately before you whiz by very quickly, perhaps almost too quickly to track with the eyes. Those on the distant highway, however, appear to move through your field of view relatively slowly, even though they’re traveling at roughly the same absolute speed as those closer to you. This difference is known as the parallax effect.

Because the real world we live in is an analog one, the parallax effect here has infinite degrees of subtle shading. But videogames which implemented parallax in the 1980s usually did so on only one or two rigid levels, resulting in scrolling landscapes that, while they may have looked better than those showing no parallax effect at all, nevertheless had their own artificial quality. Shadow of the Beast, however, uses its thirteen separate parallax layers to approach the realm of the analog, producing an effect that feels startlingly real in contrast to any of its peers. As you watch YouTube creator Phoenix Risen’s playthrough of the trained version of Shadow of the Beast below — playing a non-trained version of this game is an exercise for masochists only — be sure to take note of the scrolling effects; stunning though they were in their day, they’re like much else that used to impress in vintage Psygnosis games in that they’re all too easy to overlook entirely today.
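The arithmetic behind layered parallax is simple enough to sketch, for the technically curious. What follows is emphatically not Reflections’ actual code — on the Amiga the effect was conjured from low-level hardware trickery rather than anything this naive — but a minimal, hypothetical illustration in C of the core idea: each layer’s scroll offset is just the camera position scaled by a per-layer depth factor, so distant layers crawl while near ones race.

/* A minimal, hypothetical sketch of multi-layer parallax scrolling.
   Not Reflections' actual technique, only the underlying arithmetic:
   scale the camera position by a per-layer depth factor. */
#include <stdio.h>

#define NUM_LAYERS 13   /* Shadow of the Beast's famous layer count */

int main(void)
{
    double camera_x = 320.0;   /* camera has scrolled 320 pixels right */
    int layer;

    for (layer = 0; layer < NUM_LAYERS; layer++) {
        /* Layer 0 is the foreground and scrolls at full speed; each
           deeper layer scrolls proportionally slower. The linear
           falloff is assumed here purely for illustration. */
        double speed = 1.0 - (double)layer / NUM_LAYERS;
        int offset = (int)(camera_x * speed);
        printf("layer %2d scrolls %3d pixels\n", layer, offset);
    }
    return 0;
}

With the camera 320 pixels along, the foreground layer has scrolled the full 320 while the farthest has crept a mere 24 or so; multiply that across thirteen layers redrawn every frame and you have the “analog” feel described above.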


Whether such inside baseball as the number of layers of parallax scrolling really ought to be the bedrock of a game’s reputation is perhaps debatable, but such was the nature of the contemporary gaming beast upon which Shadow of the Beast so masterfully capitalized. Edmondson has admitted that implementing the thirteen-layer scheme consumed so much of the Amiga’s considerable power that there was very little left over to implement an interesting game even had he and his fellow developers been more motivated to do so.

Psygnosis sold this glorified tech demo for fully £35, moving into the territory Imagine had proposed to occupy with the megagames back in the day. This time, though, they had a formula for success at that extreme price point. “Shadow of the Beast just went to show that you don’t need quality gameplay to sell a piece of software,” wrote a snarky reviewer from the magazine The One. Zzap! described it only slightly more generously, as “very nice to look at, very tough to play, and very expensive.” Whatever its gameplay shortcomings, Shadow of the Beast became the most iconic Amiga game since Cinemaware’s Defender of the Crown, the ultimate argument to lay before your Atari ST-owning schoolmate. “Within a week or so of launch they could barely press enough disks to keep up with demand,” remembers Edmondson. For the Imagine veterans who had stayed the course at Psygnosis, it had to feel like the sweetest of vindications.

One can’t help but admire Psygnosis’s heretofore unimagined (ha!) ability to change. They had managed to execute a complete about-face, shedding the old Imagine legacy of incompetence and corruption. In addition to being pretty poor at making games people actually wanted to play, Imagine had been staggeringly, comprehensively bad at all of the most fundamental aspects of running a business, whilst also being, to put it as gently as possible, rather ethically challenged to boot. They had had little beyond audacity going for them. Few would have bet that Psygnosis, with two of their three leaders the very same individuals who had been responsible for the Imagine debacle, would have turned out any different. And yet, here they were.

It’s important to note that the transition from Imagine to Psygnosis encompassed much more than just hitting on a winning commercial formula. Ian Hetherington and Dave Lawson, with the aid of newcomer Jonathan Ellis, rather changed their whole approach to doing business, creating a sustainable company this time around that conducted itself with a measure of propriety and honesty. Whereas Imagine had enraged and betrayed everyone with whom they ever signed a contract, Psygnosis developed a reputation — whatever you thought of their actual games — as a solid partner, employer, and publisher. There would never be anything like the external scandals and internal backstabbing that had marked Imagine’s short, controversial life. Indeed, none of the many outside developers with whom Psygnosis did deals ever seemed to have a bad word to say about them. Hetherington’s new philosophy was to “back the talent, not the product.” Said talent was consistently supported, consistently treated fairly and even generously. Outside developers felt truly valued, and they reciprocated with fierce loyalty.

As people interested in the processes of history, we naturally want to understand to what we can attribute this transformation. Yet that question is ironically made more difficult to answer by another aspect of said transformation: rather than making a constant spectacle of themselves as had Imagine, Psygnosis became a low-key, tight-lipped organization when it came to their personalities and their internal politics, preferring to let their games, their advertising, and all those gorgeous Roger Dean cover paintings speak for them. None of Hetherington, Lawson, or Ellis spoke publicly with any frequency, and full-on interviews with them were virtually nonexistent in the popular and the trade press alike.

That said, we can hazard a few speculations about how this unlikely transformation came to be. The obvious new variable in the equation is Jonathan Ellis, a quietly competent businessman of exactly the type Imagine had always so conspicuously lacked; his steady hand at the wheel must have made a huge difference to the new company. If we’re feeling kind, we might also offer some praise to Ian Hetherington, who showed every sign of having learned from his earlier mistakes — a far less common human trait than one might wish it was — and having revamped his approach to match his hard-won wisdom.

If we’re feeling a little less kind, we might note that Dave Lawson’s influence at Psygnosis steadily waned after Brataccas was completed, with Hetherington and Ellis coming to the fore as the real leaders of the company. He left entirely in 1989, giving as his public reason his dissatisfaction with Psygnosis’s new focus on outside rather than in-house development. He went on to form a short-lived development studio of his own called Kinetica, who released just one unsuccessful game before disbanding. And after that, Dave Lawson left the industry for good, to join his old Imagine colleague Mark Butler in modern obscurity.

A rare shot of Ian Hetherington, left, with his business partner Jonathan Ellis, right, in 1991. (In the middle is Gary Bracey of Imagine’s old Commercial Breaks companion Ocean Software.) In contrast to the look-at-me! antics of the Imagine days, Hetherington and Ellis were always among the most elusive of games-industry executives, preferring to let the products — and their superb marketing department — speak for themselves.

None of this is to say that no traces of the old Imagine were to be found in the Psygnosis of 1989. On the contrary: for all the changes Hetherington and Ellis wrought, the new Psygnosis still evinced much of the Imagine DNA. To see it, one needed look no further than the Roger Dean cover art, a direct legacy of Imagine’s megagame dream. Indeed, the general focus on hype and style was as pronounced as ever at Psygnosis, albeit executed in a far more sophisticated, ethical, and sustainable fashion. One might even say that Hetherington and Ellis built Psygnosis by discarding all the scandalous aspects of Imagine while retaining and building upon the visionary ones. Sometimes the legacy could be subtle rather than obvious. For example, the foreign distribution network that had been set up by Bruce Everiss to fuel Imagine’s first rush of success was also a key part of the far more long-lived success enjoyed by Psygnosis. Hetherington and Ellis never forgot Everiss’s lesson that having better distribution than the competition — having more potential customers to sell to — could make all the difference. As home computers spread like wildfire across Western Europe during the latter half of the 1980s, Psygnosis games were there early and in quantity to reap the benefits. In 1989, Jonathan Ellis estimated that France and West Germany alone made up 60 percent of Psygnosis’s sales.

Psygnosis knew that many of their potential customers, particularly in less well-off countries, weren’t lucky enough to own Amigas. Thus, while Psygnosis games were almost always born on Amigas, they were ported widely thereafter. Like those of Cinemaware, Psygnosis games gained a cachet merely from being associated with the Amiga, the computer everyone recognized as the premier game machine of the time — even if some were, for reasons of that afore-described platform patriotism, reluctant to acknowledge the fact out loud. Even the most audiovisually spectacular Psygnosis experiences, like Shadow of the Beast, were duly ported to the humble likes of the Commodore 64 and Sinclair Spectrum, where they paled in comparison to their Amiga antecedents but sold well anyway on the strength of the association. This determination to meet the mass market wherever it lived also smacked of Imagine — albeit, yet again, far more competently executed.

The South Harrington Building, Liverpool, home of Psygnosis for several years from 1989.

In addition to cutting ties with Dave Lawson in 1989, Hetherington and Ellis also shed original investor and Liverpool big wheel Roger Talbot Smith that year, convincing him to give up his share of the company in return for royalty payments on all Psygnosis sales over the next several years. Yet even as their network of outside developers and contractors spread across Britain and beyond, Psygnosis’s roots remained firmly planted in Liverpudlian soil. They moved out of their dingy offices behind Smith’s steel foundry and into a stylishly gentrified shipping warehouse known as the South Harrington Building. It lay just one dock over from the city’s new Beatles museum, a fact that must have delighted any old Imagine stalwarts still hanging about the place. While an American journalist visiting Psygnosis in early 1991 could still pronounce Liverpool as a whole to be “very grim,” they were certainly doing their bit to change the economic picture.

Meanwhile Ian Hetherington in particular was developing a vision for Psygnosis’s future — indeed, for the direction that games in general must go. Like many of his counterparts in the United States, he saw the CD-ROM train coming down the track early. “The technological jump is exponential,” he said, “which means you have to make the jump now — otherwise when CD happens you’re going to be ten years behind, not two, and you’re never going to catch up.” With the move to the larger South Harrington Building offices, he set up an in-house team to research CD-ROM applications and develop techniques that could be utilized when the time was right by Psygnosis’s collection of loyal outside developers.

In this interest in CD-ROM — still a rare preoccupation in Britain, where consumer-computing technology lagged two or three years behind the United States — Psygnosis once again invited comparison with the American publisher Cinemaware. Yet there were important differences between the two companies on this front as well. While Cinemaware invested millions into incorporating real-world video footage into their games, Hetherington rejected the fad of “interactive video” which was all the rage in the United States at the time. His point of view reads as particularly surprising given that so many interactive-video zealots of the immediate future would be accused of making pretty but empty games — exactly what Psygnosis was so often accused of in the here and now. It serves perhaps as further evidence of Hetherington’s ability to learn and to evolve. Hetherington:

Interactive video is a farce. It is ill-conceived and it doesn’t work. It is seductive, though. Trying to interact with £400,000 worth of video on disc is a complete fiasco. We are looking for alternative uses of CD. You have to throw your existing thinking in the bin, then go sit in the middle of a field and start again from scratch. Most developers will evolve into CD. They will go from 5 MB products to 10 MB products with studio-quality soundtracks, and that will be what characterizes CD products for the next few years.

Hetherington also differed from Cinemaware head Bob Jacob in making sure his push into CD-ROM was, in keeping with the new operational philosophy behind Psygnosis in general, a sustainable one. Rather than betting the farm like his American counterpart and losing his company when the technology didn’t mature as quickly as anticipated, Hetherington used the ongoing sales from Psygnosis’s existing premium, high-profit-margin games — for all their audiovisual slickness, these simple action games really didn’t cost all that much to make — to fund steady, ongoing work in figuring out what CD-ROM would be good for, via an in-house team known as the Advanced Technology Group. When the advanced technology was finally ready, Psygnosis would be ready as well. At risk of belaboring the point, I will just note one last time how far this steady, methodical, reasoned approach to gaming’s future was from the pie-in-the-sky dreaming of the Imagine megagames.

The Psygnosis Advanced Technology Group in 1991. Standing second from left in the back row is John Gibson — the media’s favorite “Granddad” himself — who after spending some years with Denton Designs wound up rejoining some other old Imagine mates at Psygnosis.

As the 1990s began, then, Psygnosis had a lot going for them, both in the Amiga-dominated present and in the anticipated CD-ROM-dominated future. One thing they still lacked, though — one thing that Imagine also had resoundingly failed to produce — was a single game to their name that could be unequivocally called a classic. It was perhaps inevitable that one of the company’s stable of outside developers, nurtured and supported as they were, must eventually break away from the established Psygnosis gameplay formulas and deliver said classic. Still, such things are always a surprise when they happen. And certainly when Psygnosis did finally get their classic, the magnitude of its success would come as a shock even to them.

(Sources: the book Grand Thieves and Tomb Raiders: How British Videogames Conquered the World by Rebecca Levene and Magnus Anderson; the extra interviews that accompanied the documentary film From Bedrooms to Billions; Retro Gamer 50; Computer and Video Games of October 1987; STart of June 1990; The One of July 1989, March 1990, September 1990, and February 1992; The Game Machine of October 1989 and August 1990; Transactor of August 1989; Next Generation of November 1995; the online articles “From Lemmings to Wipeout: How Ian Hetherington Incubated Gaming Success” from Polygon, “The Psygnosis Story: John White, Director of Software” from Edge Online, and “An Ode to the Owl: The Inside Story of Psygnosis” from Push Square.)

 