


The story of Shadowkeep, even more so than Amazon the odd duck in the Telarium lineup, begins with Sigma Distributing, one of the first big microcomputer hardware distributors in the Seattle area. In 1981 Christopher Anson, a Sigma vice president, sought and received permission to start a new subsidiary to develop original games to serve the growing demand for software for the computers Sigma was selling. Anson’s first two acts were to name the company Ultrasoft and hire a programmer named Alan Clark away from Boeing. Clark became the technical architect of the set of tools and approaches that would define Ultrasoft during their brief existence.

Anson had decided that the best place for Ultrasoft to make a splash was in the field of illustrated adventure games, a nexus of excitement in the wake of Mystery House and The Wizard and the Princess. Like Scott Adams, Ken Williams, and Marc Blank before him, Clark realized that it would be more efficient in the long run to write an adventure-game engine and language that could give designers a bit of distance from the technical details of implementation as well as let Ultrasoft deploy their games to multiple platforms relatively painlessly. Deciding that a little corporate branding is always in order, they named their adventure programming language simply Ultra; the interpreter for each targeted platform UltraCode; their graphics system UltraVision. From the standpoint of the end user, Ultrasoft’s most obvious innovation — or, if you like, gimmick — involved this last. UltraVision could display not just static pictures but also brief animations, which could be used to, say, show the player’s avatar actually walking from room to room. Less obvious but no less significant, however, was the parser, one of the first developed outside of Infocom that allowed more than two words — although, it should be said, it was nowhere near as impressive a creation overall (more on that shortly).

Ultrasoft developed two adventures using the system, The Mask of the Sun (1982) and Serpent’s Star (1983). Both are interesting in their way, more carefully crafted, atmospheric, and thoughtful than was the norm of the time. Serpent’s Star in particular does a surprisingly good job of matching its puzzles to its theme of Buddhist philosophy. But both — and particularly Mask of the Sun — are also riddled with the sorts of unfair elements that were all too typical of their era. And both are fairly excruciating to play under any conditions. Whatever its other merits, you see, the Ultra system is slow. A quick look at the games’ technical underpinnings gives a clue as to why: the Ultra program logic doesn’t appear to be compiled at all, merely interpreted in place. The only blessing of that approach was that it enabled some frustrated adventurers to find the solutions to the more incomprehensible puzzles by code diving.

Ultrasoft first tried to market and distribute Mask of the Sun and Serpent’s Star on their own, but found it tough sledding for a tiny company with mainly regional connections in the professionalizing software industry of 1983. They soon accepted the role of developer only, licensing both games to Brøderbund for publication. After spending most of 1983 working on an ambitious new game, an adventure-game/CRPG hybrid called Shadowkeep written in a new version of their system which they dubbed Ultra II, they found a publisher for it in Spinnaker, who took the largely completed game as a future member of their planned Telarium line.

Looking to find some way to make the game fit with the bookware theme of the line as a whole, Spinnaker came up with the idea of reversing the usual process, of hiring a name writer to adapt the game into a book rather than the opposite. Luckily, they had substantial time to get the novelization done; they signed the contract with Ultrasoft in late 1983, a year before they planned to launch the Telarium line. Spinnaker approached Warner Books, who hooked them up with the reigning king of media tie-in novels, Alan Dean Foster. After making his reputation within the industry ghost-writing the Star Wars novelization for George Lucas, Foster had gone on to do The Black Hole, Clash of the Titans, The Last Starfighter, and Alien among many others. Virtually any big science-fiction or fantasy movie released seemed to arrive with an accompanying Alan Dean Foster novelization. (That’s still true today; his most recent novelization as of this writing is of Star Trek into Darkness.)

Spinnaker furnished Foster with design documents and a copy of the Shadowkeep source code and let him have at it. Those were all he had to go on; he doesn’t recall ever meeting or speaking with anyone from the design team, nor ever actually playing the game. He does, however, recall it as a very challenging project indeed. Being a party-based CRPG in the mold of Wizardry, Shadowkeep has no actual protagonist to speak of — no characters at all, really, outside of the fellow who sells you equipment and the evil demon Dal’Brad who shows up for the final showdown on the last of the dungeon levels. He was thus forced to invent the vast majority of the novel himself, whilst struggling to strike a balance between writing some recognizable analogue to the experience of the game and giving away all of its challenges. He did his usual workmanlike job, handing in a readable little genre exercise for simultaneous release with the game. Tellingly, it’s not until halfway through the book that the heroes enter the Shadowkeep — i.e., reach the beginning of the game.


Said game is… well, it’s really strange. Imagine Wizardry played with a text parser, and you’ve pretty well summed up Shadowkeep. You make a party of up to nine(!) characters. Most of the usual is here: attribute scores, classes and races to choose from, ever-better equipment and spells to collect. Oddly missing, however, are character levels and any concept of experience; getting more powerful in this game is strictly a matter of finding or buying better stuff. The dungeon levels are the usual 16 × 16 grids full of traps, monsters, and assorted cartographic challenges. There are some original ideas here. For instance, the positions of the monsters that attack you and those of the members of your party are taken into account to a degree not found in Wizardry, adding some strategic depth to the experience. You likewise have more combat options than in Wizardry; in each round you can choose to forget defense and attack twice, or to just parry, or to attack once while not totally neglecting defense. And certainly the full-color graphics, which feature occasional examples of Ultrasoft’s trademark animations, are much better than Wizardry‘s wire frames.


Still, Shadowkeep mostly just makes you appreciate all the more how well Wizardry does the dungeon crawl. The game replaces Wizardry‘s hot-key interface with, yes, a text-adventure parser. You literally just type what you want to do: “OPEN DOOR,” “GET THE TORCH,” “CAST THE LUMINANCE SPELL,” “LIGHT THE TORCH AND PREPARE THE SWORD,” “PUT THE WAND OF TRAVEL IN THE CHEST.” Sounds fine, right? Well, what sounds fine in the abstract doesn’t work so well in practice. You must now type “F <RETURN>” (for “FORWARD”) instead of just “F” every time you want to walk forward a square in the dungeon. This may seem a minor thing, but consider that you’ll be entering this command thousands and thousands of times in the course of playing the game. That extra keystroke thus means thousands and thousands of extra keystrokes. And that’s the tip of the iceberg; this game is death by a hundred such small cuts. Commands by default are carried out by the leader of your party, who is not even a character you select but merely the one with the highest Leadership attribute score. Having someone else do something requires that you prepend her name to the command (“NAOMI GET THE TORCH AND GIVE IT TO REB”) — yet more tedious typing.

And the parser, that focal point of the whole interface, is at least as exasperating as the mainline Telarium parser. Like Byron Preiss Video Productions and many others at this time, Ultrasoft chose to take a profoundly misguided approach to this most critical piece of their engine. As described in an article in Softline magazine:

Ultrasoft’s parser is based on concepts in artificial intelligence. In any given message, it eliminates words that don’t make sense and attempts to make sense out of words that are relevant to the situation. This method frees the player from the verb-noun format of the typical adventure’s input.

In other words, the parser pretends to be smarter than it is by simply throwing out anything it doesn’t understand and doing what it can with the rest. This approach may “free the player from the verb-noun format,” but it also guarantees that complex (and often not so complex) inputs will be not just rejected — which combined with a proper error message is at least a form of useful feedback — but misunderstood. Far from making the parser seem smarter, this just makes it seem that much dumber and that much more infuriating. It leads to situations like that in the Byron Preiss games where any input containing the word “LOOK” anywhere within it causes the parser to dump everything else and print a room description. In Shadowkeep, typing “NAOMI CAST CURE SPELL ON REB” leads her to cast it away into the ether — that “ON REB” was a bridge too far, and thus ignored. Such a system fails to recognize that at least 95% of the time those extra words are not just stuff the player tacked on for the hell of it (who wants to type more than necessary under any circumstances?) but essential information about what she really wants to do.
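The failure mode described above is easy to see in miniature. Here’s a minimal sketch (in no way Ultrasoft’s actual code, and with a purely hypothetical vocabulary) of the “discard whatever you don’t understand” strategy:

```python
# Hypothetical vocabulary for illustration; the real game's word list is unknown.
KNOWN_WORDS = {"naomi", "cast", "cure", "spell", "get", "torch", "open", "door"}

def naive_parse(command):
    """Silently drop any word not in the vocabulary and act on what's left.

    There is no error message for the dropped words, so the player never
    learns that part of the command was thrown away.
    """
    return [w for w in command.lower().split() if w in KNOWN_WORDS]

# "ON" and "REB" aren't in the vocabulary, so the target of the spell
# simply vanishes -- the cure gets cast into the ether:
naive_parse("NAOMI CAST CURE SPELL ON REB")
# → ['naomi', 'cast', 'cure', 'spell']
```

A parser that instead rejected the input with “I don’t know the word ‘REB’” would at least tell the player what went wrong; silently acting on the truncated command is what makes the system feel dumber rather than smarter.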

To play Shadowkeep is to constantly wrestle with the interface. After playing several hours there are basic tasks I still haven’t figured out how to do — like how to cast a cure spell on someone outside of combat, or how to just get a list of the spells a certain character knows. And, like Ultrasoft’s earlier games, Shadowkeep is slow. Every step in the dungeon seems to take an eternity, and as for more complex action… forget about it. Playing is like wading through molasses with shackled feet.

The rewards for all the parsing pain are relatively slight: a handful of logic- or object-oriented puzzles on each level that can perhaps be a bit more complex than they could be under the Wizardry engine. Needless to say, they aren’t worth the rest of the trouble, making Shadowkeep something of a lowlight in the long, chequered history of adventure/CRPG hybrids. Which is a shame, because Shadowkeep‘s dungeon levels do show evidence of some careful craftsmanship and, as noted above, there are some good, original ideas on display here. Shadowkeep is a perfect example of a potentially worthy game destroyed by horrid interface choices. And I mean that literally: if the game isn’t outright unplayable (some patient souls have apparently played and even enjoyed it), it’s closer than I ever need to come to that adjective.

Ultrasoft was already in the process of fading quietly away by the time of Shadowkeep‘s late 1984 release. They never managed to port the Ultra II engine beyond the Apple II, leaving Shadowkeep without that all-critical Commodore 64 version. Spinnaker toyed with doing the port themselves, even announcing it as coming soon on various occasions, but I see no reason to believe that ever happened. (A Commodore 64 version has been a semi-mythical White Whale in collecting circles for many years now, but, despite some anecdotal claims and remembrances, no one has ever produced an actual working version to my knowledge.) The lack of a Commodore 64 version and the underwhelming nature of the game itself combined to make Shadowkeep the least successful — and, today, rarest — of all the Telarium games. Alan Dean Foster’s book, while no bestseller itself, appears to have sold far more copies on the author’s name recognition and its $3 (as opposed to $35) price tag.

Shadowkeep consists, like most of the Telarium games, of four disk sides. In this case, however, all four sides are written to during play to preserve the current state of the dungeon levels; the player is expected to copy her originals before beginning. To my knowledge none of the copies floating around the Internet are pristine. All contain the residue of previous players in their dungeons. I hope that at some point some enterprising collector will take the time to scan and archive an original set of disks. In the meantime I’m posting for download here the cleanest set I can find, whose first player didn’t seem to get much if at all beyond the first dungeon level. If whilst playing Wizardry or Bard’s Tale you thought to yourself that this game would be even better if it played a lot slower and had a parser, you’ve just found your dream CRPG. All others should consider this one a subject for historical research only.

And on that less than stellar note we’ll be moving on from Telarium for a while. My final reckoning of their first five releases is: two worthy efforts (Dragonworld and Amazon); one could-have-been-a-contender (Fahrenheit 451); and two total misfires (Rendezvous with Rama and Shadowkeep). Not a horrible track record on the whole. We’ll see if they learned any lessons in time for their last few games down the road a ways. But next it’s time to get back to the big boys in the field, and tell the rest of the story of Infocom’s very eventful 1984.

(In addition to the sources listed in my first article on bookware and Telarium, I also referenced for this article a feature on Ultrasoft in the May/June 1983 Softline. And thanks to Alan Dean Foster for taking the time to share his memories of the Shadowkeep project with me.)



Amazon in Pictures


I thought we’d look at Amazon a little differently from the other Telarium games because it is, even more so than the others, very much a visual as well as textual experience. I therefore thought I could best convey the experience of playing it with lots and lots of pictures. It also marks one of the last of the classic Apple II “hi-res adventures,” which whatever their other failings had a unique aesthetic of their own. With the Commodore 64 so eclipsing other gaming platforms by 1984 — remember, Amazon with its long gestation period is in a sense much older than its eventual publication date of late that year — we won’t be seeing a whole lot more of this look. So, let this be our goodbye to one era even as it also represents a prime example of the newer, more sophisticated era of bookware. And anyway things have been kind of dry around here visually for a while now. This blog could use some pictures!

Uniquely for the Telarium line, Amazon lets you choose one of three difficulty levels. I’m playing on the highest difficulty of “Expedition Leader” here, which gives the most to see but is also pretty brutal; death lurks literally everywhere, and comes often (usually?) with no warning whatsoever.


Shay Addams referred to Amazon as an “interactive movie” in his Questbusters review, one of the earlier applications of that term to a computer game. And indeed, the opening sequence is very cinematic, and suitably dramatic. After tuning a receiver to catch the satellite transmission, we watch as the camera pans around the smoking, demolished remnants of the previous Amazon expedition’s campground. It ends with a shot of a member of the cannibal tribe that replaces the killer gorillas of the novel as the architects of all this destruction.


Replacing Amy the signing gorilla is Paco the talking parrot, shown here in this lovely illustration by David Durand. I find him kind of hilarious, but I’m not entirely sure if he was written that way intentionally or not; Crichton, whatever his other strengths, isn’t normally what you’d call a funny writer. Paco at first appears to be a classic adventure-game sidekick/hint system, giving advice constantly throughout the game. In a departure from the norm, however, his advice is, at least on Expedition Leader level, disastrously misguided at least 50% of the time, getting you killed or stranded in all sorts of creative ways. Crichton often stated that he wanted to make a more believable, realistic adventure game. In that spirit, I suppose taking everything said by a talking parrot as gospel might not get you very far in the real world. But then if we’re debating realism we have to also recognize that Paco is basically a cartoon character, even more so than Amy the ridiculously intelligent, loyal, and empathetic gorilla of the novel. Foghorn Leghorn’s got nothing on this guy.


Like in the book, we can use our field computer to link up with NSRT headquarters for regular updates. The above shows the situation just after we’ve parachuted with Paco into the Amazon rain forest. Looks like other than the cannibals and the rampaging government troops and that volcano that’s about to erupt there’s nothing to worry about.



Have I mentioned that it’s easy to die at Expedition Leader level? One wrong move leads to one of a rogue’s gallery of gleefully described death scenes to rival one of the Phoenix games.


Crichton’s opinion of Peru’s military seems to be no higher than was his opinion of Zaire’s in the novel.


In another strikingly cinematic scene, we use our handy night-vision goggles and an assist from Paco to sneak away from the troops who captured us.


With the “corrupt government troops” behind us, we now get to deal with the Kemani tribesmen. Luckily, they like cigarettes and we happen to have a pack.


We climb the volcanic Mount Macuma, which separates us from our objective and will soon give us problems in another way.


NSRT airlifts some desperately needed supplies to us. (Why do I want to hear Paco saying “De plane! De plane!” when I see this screenshot?)


Crichton may have been trying to make a new type of adventure game, but he couldn’t resist including a very old-school maze which we have to navigate to reach the airdropped supplies. This is actually the only part of the game which requires mapping. Normally it’s much more interested in forward plot momentum than the details of geography.


Getting across the river is even more difficult than was getting over the mountain. Once again our night-vision goggles come in handy.


Next morning we find that mischievous monkeys have stolen our supplies. A merry chase follows, implemented as one of Amazon‘s two action games. These were not likely to make arcade owners nervous, but at least they aren’t embarrassingly bad like the action games in Telarium’s other titles. They’re actually kind of engaging in their way; a nice change of pace. Indeed, Amazon‘s way of constantly throwing different stuff at you is one of the most impressive things about it. The screen is constantly changing. “Whatever works for this part of the story” seems to have been Crichton’s philosophy.


One more obstacle to cross, and we come to the outskirts of the lost city of Chak. It doesn’t exactly look welcoming.


The cannibals attack that night and, in increasing numbers, every night we remain in Chak. The attack is presented as another action game, this time a Space Invaders-like affair which, while not as original or entertaining as the monkey chase, is at least competently executed.


We have about five days in Chak before the volcano erupts. If that sounds generous, know that it’s really not; time passes devilishly quickly. Our main objective is to find the secret staircase that will take us to the endgame.


The endgame requires us to open a series of doors in the correct order using clues found onscreen — one of the few classically adventure-gamey puzzles you’ll find in Amazon. The correct sequence for the above, for example, is 1-3-2. I assume this is because there are 9 marks on the first door, 13 on the second, and 11 on the third. At any rate my first instinct was to arrange them in numerical sequence, and it worked.
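If my guess about the marks is right, the puzzle amounts to sorting the doors by their mark counts. A one-line sketch (the counts are my own tally from the screen, so treat them as an assumption):

```python
# Door number -> number of marks on that door (my count; possibly wrong).
marks = {1: 9, 2: 13, 3: 11}

# Arrange the doors in ascending order of mark count.
order = sorted(marks, key=marks.get)
# → [1, 3, 2], matching the working sequence 1-3-2
```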


The final sequence is, to say the least, a bit more tricky. Now we have nine doors to contend with. This puzzle, which appears only at Expedition Leader level, stumped me entirely and forced me to a walkthrough. If you can solve it, or even just give the methodology for solving it given the correct answer, I’d love to hear about it. To see the answer, highlight the empty space that follows: 3-4-8-1-5-9-2-6-7.


Get past the last of the doors and we come to enough emeralds to warm any greedy adventurer’s heart. And after that, to quote Neal Stephenson, “It’s just a chase scene,” as we rush to get away from the erupting volcano.

Crichton wouldn’t return to computer games until some fifteen years after Amazon. It’s not hard to understand why. Even if Amazon sold 100,000 copies, his earnings from it would have been a drop in the bucket compared to what he earned from his books and movie licenses. Yet Amazon is good enough that it makes me wish he had done more work in interactive mediums.

Which is not to say that it doesn’t have its problems. The parser is no better than you might expect from such a one-off effort; on at least one or two occasions I knew exactly what to do but had to turn to the walkthrough to figure out how to say it to the game. And the story logic often has little to do with real-world logic. If you don’t do everything just right in the opening stages of the game, for instance, your flight to Peru will get hijacked and you’ll end up dead after the game toys with you a bit — this despite there being no logical reason why your previous failings should have led to your flight getting hijacked.

Still, Amazon is a unique experience, as I hope the pictures above convey. Especially if played on one of the less masochistic levels, it’s a fast-moving rush of a game that’s constantly throwing something new and interesting at you. And it really is relentlessly cinematic, replete with stylish little touches. Even when it’s working with just text, words often stutter onto the screen in clumps to mimic conversation, or are pecked out character by character when they’re coming through your satellite computer hookup. There’s a sense that things could go in any direction, that anything could be asked of you next, rules of computer-game genres be damned. If that sounds appealing, by all means download it, fire up your Apple II emulator, and give it a go.



From Congo to Amazon

There are new ways of presenting information other than the traditional ways in which the reader or viewer is required to be passive. A few years ago, I realized that I didn’t know about these things, and that I’d better find out about them. The only way I could learn was to actually go and do one. So I said, “Well, I’ll just make a game and then I’ll learn.” And I certainly did.

– Michael Crichton, 1984

Anyone who had been reading Michael Crichton’s novels prior to the founding of the Telarium brand had to know of his interest in computers. The plot of 1972’s The Terminal Man, of a man who has a computer implanted in his brain, is the sort of thing that would become commonplace in science fiction only with the rise of cyberpunk more than a decade later. And of course computers are also all over 1980’s Congo; indeed, they’re the only reason the heroes are out there in the jungle in the first place. Crichton’s personal history with computers also stretches back surprisingly far. Always an inveterate gadget freak, he bought his first computer-like machine in the form of an Olivetti word processor almost as soon as his earnings from his first hit novel, The Andromeda Strain, made it possible. He wrote his books for years on the Olivetti. When the trinity of 1977 arrived, he quickly jumped aboard the PC revolution with an Apple II, first of a stable that within a few years would also include Commodores, Radio Shacks, and IBMs.

Never shy about sharing his interests in print, Crichton became a semi-regular contributor to Creative Computing magazine, who were thrilled to have a byline of his prominence under any terms. Thus they gave him free rein to opine in the abstract:

I would argue that it [computer technology] is a force of human evolution, opening new possibilities for our minds, simultaneously freeing us from drudgery while presenting us with a parody of our own rational sides. Computers actually show us both the benefits and the limits of rationality with wonderful precision. What could be more rational than that pedantic little box that keeps saying SYNTAX ERROR over and over? And what does our frustration suggest to us, in terms of other things to do and other ways to be?

But Crichton was more than the mere dabbler that poeticisms like the above might suggest. He took the time to learn how to program his toys, publishing fairly intricate program listings in BASIC for applications such as casting the I Ching (a byproduct of his seldom remarked interest in mysticism; see his nonfiction memoir Travels, which might just be the most interesting thing he ever wrote); identifying users based on their typing characteristics (inspired by his recent short story “Mousetrap”); and creating onscreen art mirroring that of abstract painter Josef Albers (Crichton’s interest in and patronage of the visual arts also tends to go unremarked). In 1983 he published the book Electronic Life: How to Think About Computers, a breezy introduction for the layman which nevertheless shared some real wisdom on topics such as the absurdity of the drive for “computer literacy” which insisted that every schoolchild in the country needed to know how to program in BASIC to have a prayer of success in later life. It also offered a spirited defense of computers as tools for entertainment and creativity as well as business and other practical matters.

Which isn’t to say that he didn’t find plenty of such practical applications for his computers. During this part of his life Crichton was immersed in planning for a movie called Runaway, which was to star Tom Selleck and Gene Simmons of Magnum P.I. and Kiss fame respectively. He hoped it would be one of the major blockbusters of 1984, although it would ultimately be overshadowed by a glut of other high-profile science-fiction films that year (The Terminator, Star Trek III, 2010). He hired a team to create a financial-modeling package which he claimed would allow a prospective filmmaker to input a bunch of parameters and have a shooting budget for any movie in “about a minute.” It was soon circulating amongst his peers in Hollywood.

Thus when the folks at Telarium started thinking about authors who might be interested in licensing their books and maybe even working with them on the resulting adaptations, Crichton was a natural. Seth Godin approached him in late 1983. He returned with extraordinary news: not only was Crichton interested, but he already had a largely completed game for them, based on his most recent novel, Congo.

Crichton had first started thinking he might like to write a game as long as two years before Godin’s inquiry. He’d grown frustrated with the limitations of the adventure games he’d played, limitations which seemed to spring not just from the technology but also from the lack of dramatic chops of their programmers.

I simply didn’t understand the mentality that informed them. It was not until I began programming myself that I realized it was a debugger’s mentality. They could make you sit outside a door until you said exactly the right words. Sometimes you had to say, “I quit,” and then it would let you through.

Well, that’s life in the programming world. It’s not life in any other world. It’s not an accepted dramatic convention in any other arena of entertainment. It’s something you learn to do when you’re trying to make the computer work.

Here’s what I found out early on: you can’t have extremely varied choices that don’t seem to matter. I can go north, south, east, or west, and who cares? You can only do that for a while, and then if you don’t start to have an expectation of what will happen, you’ll stop playing the game. You’d better get right going and you’d better start to have something happen.

If I play a game for a half-hour and it doesn’t make any sense to me, I’ll just quit and never go back. Say I’m locked in this house and I don’t know what the point of the house is and why I can’t get out and there’s no sort of hint to me about the mentality that would assist me in getting out — I don’t know. I could say “Shazam!” or I could burn the house down or — give me a break. I just stop.

Crichton started to sketch out his own adventure game based on Congo, whose simple quest plot structure made it a relatively good choice for conversion to the new format. Realizing that his programming skills weren’t up to the task of implementing his ideas, he hired programmer Stephen Warady to write the game in Apple II assembly language. The little team was eventually completed by David Durand, an artist who normally worked in film graphics. The game as it evolved was as much a mixed-media experience as text adventure, incorporating illustrations, simple action games, and other occasional graphical interludes that almost qualify as cut scenes, perfectly befitting this most cinematic of writers (and, not incidentally, making the game a perfect match with Telarium’s other games once they finally came calling). Crichton would sometimes program these sequences himself in BASIC, then turn them over to Warady to redo in much faster assembly language. Given Crichton’s other commitments, work on Congo the game proceeded in fits and starts for some eighteen months. They were just getting to the point of thinking about a publisher when Godin arrived to relieve them of that stress.

When Spinnaker started their due diligence on the deal, however, a huge problem quickly presented itself: Crichton, as was typical for him by this time, had already sold the media rights to Congo to Hollywood. (After they languished there for many years, the success of the Jurassic Park film would finally prompt Paramount Pictures to pick them up and make a Congo movie at last in 1995. Opinions are divided over whether that movie was just bad or so cosmically bad that it became good again.) Those rights unfortunately included all adaptations, including computer games, something the usually business-savvy Crichton had totally failed to realize. Spinnaker may have been a big wheel in home computers, but they didn’t have much clout in Hollywood. So, they came up with another solution: they excised the specifics of the novel from the game, leaving just the plot framework. The Congo became the Amazon; Amy the signing gorilla became Paco the talking parrot; Earth Resources Technology Services became National Satellite Resources Technology; a diamond mine became an emerald mine; African cannibals and roving, massacring army troops became South American cannibals and roving, massacring army troops. It may not have said much for Crichton and Spinnaker’s appreciation for cultural diversity, but it solved their legal problems.

Amazon was written for the Apple II in native assembly language. Spinnaker, however, took advantage of the rare luxury of time — the game was in an almost completed state when Crichton signed in late 1983, fully one year before the Telarium line’s launch — to turn it over to Byron Preiss Video Productions to make a version in SAL for the all-important Commodore 64 platform. The result wasn’t quite as nice an experience as the original, but it was acceptable. And it was certainly a wise move: Amazon became by all indications the most successful of all the Telarium games. Some reports have it selling as many as 100,000 copies, very good numbers for a member of a line whose overall commercial performance was quite disappointing. The majority of those were most likely the Commodore 64 version, if sales patterns for Amazon matched those for the industry as a whole.

I do want to talk about Amazon in more detail; it's an historically important game thanks, if nothing else, to Crichton's involvement, and also a very interesting one, with some genuinely new approaches. But we'll save that discussion for next time. In the meantime, feel free to download the Apple II version from here if you'd like to get a head start. Note that disk 3 is the boot disk.

(All of the references I listed in my first article on bookware still apply. Useful interviews with Crichton appeared in the February 1985 Creative Computing and February 1985 Compute!. Other articles and programs by Crichton appeared in Creative Computing's March 1983, June 1984, and November 1984 issues.)


Posted by on October 11, 2013 in Digital Antiquaria, Interactive Fiction



A Computer for Every Home?

On January 13, 1984, Commodore held their first board of directors meeting of the year. It should have been a relaxed, happy occasion, a time to make plans for the new year but also one last chance to reflect on a stellar 1983, a year in which they had sold more computers than any two of their rivals combined and truly joined the big boys of corporate America by reaching a billion dollars in gross sales. During the last quarter of 1983 alone they had ridden a spectacular Christmas buying season to more than $50 million in profits. Commodore had won the Home Computer Wars convincingly, driving rival Texas Instruments to unconditional surrender. To make the triumph even sweeter, rival Apple had publicly announced the goal of selling a billion dollars worth of their own computers that year, only to fall just short thanks to the failure of the Lisa. Atari, meanwhile, had imploded in the wake of the videogame crash, losing more than $500 million and laying off more than 2000 workers. Commodore had just the previous summer moved into a sprawling new 585,000-square-foot, two-story headquarters in West Chester, Pennsylvania that befitted their new stature. (Some of the manufacturing spaces and warehouses in the place were so large that Commodore veterans insist today that they had their own weather.) Yes, it should have been a happy time at Commodore. But instead there was doubt and trepidation in the air as executives filed into the boardroom on that Friday the 13th.

A day or two before, Jack Tramiel had had a heated argument with Irving Gould, Commodore’s largest shareholder and the man who controlled his purse strings, in the company’s private suite above their exhibit at the 1984 Winter Consumer Electronics Show. That in itself wasn’t unusual; these two corrupt old bulldogs had had an adversarial relationship for almost two decades now. This time, however, observers remarked that Gould was shouting as much as Tramiel. That was unusual; Gould normally sat impassively until Tramiel exhausted himself, then quietly told him which demands he was and wasn’t willing to meet. When Tramiel stormed red-faced out of the meeting and sped away in the new sports car he’d just gotten for his 55th birthday, it was clear that this was not just the usual squabbling. Now observers outside the board-of-directors meeting, which was being chaired as usual by Gould, saw him depart halfway through in a similar huff. He would never darken Commodore’s doors again.

No one who was inside that boardroom has ever revealed exactly what transpired there. With Gould and Tramiel both now dead and the other former board members either dead or aged, it’s unlikely that anyone ever will. On the face of it, it seems hard to imagine. What could cause these two men who had managed to stay together through the toughest of times, during which Commodore had more than once teetered on the edge of bankruptcy, to irrevocably split now, when their company had just enjoyed the best year in its history? We can only speculate.

Commodore had ceased truly being Tramiel's company in 1966, when Gould swooped in to bail him out from the Atlantic Acceptance scandal of the previous year. Tramiel, however, never quite got the memo. He continued to run the company like a sole proprietor to whatever extent that Gould would let him. Tramiel micro-managed to an astonishing degree. He did not, for instance, believe in budgets, considering them a “license to steal,” a guarantee that the responsible manager, knowing he had X million available, would always spend at least X million. Instead he demanded that every expenditure of greater than $1000 be approved personally by him, with the result that much of the company ground to a halt any time he took a holiday. Even as Tramiel enjoyed his best year ever in business, Gould and others in the financial community were beginning to ask the very reasonable question of whether this was really a sustainable way to run a billion-dollar company.

Still, the specific cause of Tramiel's departure seems likely to have involved his sons. Tramiel valued family above all else, and, like a typical small businessman, dreamed of leaving “his” company to his three sons. Whether by coincidence or something else, it even worked out that each son had an area of expertise that would be critical to running a company like Commodore. Sam, the eldest, had trained in business management at York University, while Gary, the youngest, was a financial analyst with a degree from Menlo College and experience as a stockbroker at Merrill Lynch. Leonard, the middle child, was the intellectual and the gearhead; he was finishing a PhD in astrophysics at Columbia, and was by all accounts quite an accomplished hardware and software hacker. Sam and Gary already worked for Commodore, while Leonard planned to start as soon as he finished his PhD in a few more months. Various witnesses have claimed that Tramiel the elder now wished to begin more actively grooming this three-headed monster to take more and more of his responsibilities, and someday to take his place. Feeling nothing good could come out of such blatant nepotism inside a publicly traded corporation that was trying to put its somewhat seedy history behind it, Gould refused absolutely to countenance such a plan. Given Tramiel's devotion to his family and his attitude toward Commodore as his personal fiefdom, it does make a degree of sense that this particular rejection might have been more than he could stomach.

In any case, Tramiel was gone, and Gould, who had made his fortune in the unglamorous world of warehousing and shipping and was reportedly both a bit jealous of Tramiel's high profile in an exciting, emerging industry and a bit embarrassed by his gruff, untutored ways, didn't seem particularly distraught about it. The man he brought in to replace him could hardly have been more different. Marshall F. Smith was a blandly feckless veteran of boardrooms and country clubs who had spent his career in the steel industry. It's hard to grasp just why Gould latched onto Smith of all people. Perhaps he was following the lead of Apple, who the previous year had brought in their own leader from outside the computer industry, John Sculley. Sculley, however, understood consumer marketing, having cut his teeth at Pepsi, where he was the mastermind behind the Pepsi Challenge, still one of the most iconic and effective advertising campaigns in the long history of the Cola Wars. The anonymous world of Big Steel offered no comparable experience. Smith's appointment was the first of a long string of well-nigh incomprehensible mistakes Gould would make over the next decade. Engineers who were initially thrilled to have proper funding and actual budgets at last were soon watching with growing concern as Smith puttered about with a growing management bureaucracy and let the company drift without direction. Many were soon muttering that it's often better to make a decision — even the wrong decision — than to just let things hang. Whatever else you could say about Jack Tramiel, he never lacked the courage of his convictions.

Commodore’s first significant new models, which reached stores at last in October of 1984, more than two years after the Commodore 64, hardly did much to inspire confidence in the new regime. Nothing about the Commodore 16 and the Plus/4 made any sense at all. The 16 was an ultra-low-end model with just 16 K of memory, long after the time for such a beast had passed. The trend in even inexpensive 8-bit computers was onward, toward the next magic number of 128 K, not backward to the late 1970s.

The Commodore Plus/4

As for the Plus/4, which like the 64 was built around a variant of the 6502 CPU and had the same 64 K of memory but was nevertheless incompatible… well, it was the proverbial riddle wrapped in a mystery inside an enigma. It was billed as a more “serious” machine than the 64, a computer for “home and business applications” rather than gaming, and priced to match at about $300, more than $100 more than the 64. It featured four applications built right into its ROM (thus the machine's name): a file manager, a word processor, a spreadsheet, and a graphing program. All were pathetically sub-par even by the standards of Commodore 64 applications, hardly the gold standard in business computing. The Plus/4 lacked the 64's sprites and SID sound chip, which made a degree of sense; for a dismaying number of years yet a lack of audiovisual capability would be taken as a signifier of serious intent in computing. But why did it offer more colors, 128 as opposed to the 64's 16? And as an allegedly more serious computer, why didn't it offer the 80-column display absolutely essential for comfortable word processing and other typical productive tasks? And as a more serious (and expensive) computer, why did it have a rubbery keyboard almost as awful to type on as the IBM PCjr's Chiclet model? And would all those serious, more productive buyers really be doing a lot of BASIC programming? If not, why was one of the main selling points a much better BASIC than the bare-bones edition found in the 64? Info, a magazine that would soon build a reputation for saying the things about Commodore's bizarre decisions that nobody else would, gave the Plus/4 a withering review:

The biggest problem with the Plus/4 is the fundamental concept: an 8-bit, 64 K, 40-column desktop personal computer. Commodore already makes the best 8-bit, 64 K, 40-column desktop personal computer you can buy, with literally thousands of products supporting it! Why should consumers want a “new” machine with no significant advances, several new limitations, and virtually no third-party product support? And why would a company with no competition in the under-$500 category bring out an incompatible [machine] that can’t compete with anybody’s machine except their own? It just doesn’t compute!

Info ran a wonderfully snarky contest in the same issue, giving away the Plus/4 they’d just reviewed. After all, it was “sure to become a collector’s item!” Even the more staid Compute!’s Gazette managed to flummox a poor Commodore representative with a single question: “Why buy a 264 [a pre-release name for the Plus/4] instead of a 64 that has a word processor and, say, a Simon’s BASIC? It would be the equivalent of the 264 for less money.” Commodore happily claimed that the Plus/4 had enough utility built right in for the “average small business” (maybe they meant one of the vast majority that fail within a year or two anyway), but in reality it seemed like it had been cobbled together from spare parts that Commodore happened to have lying around. In fact, that’s not far from what happened — and Tramiel actually bears as much responsibility for the whole fiasco as the clueless Marshall Smith.

Tramiel, you'll remember, had driven away the heart of his engineering team in his usual hail of recriminations and lawsuits shortly after they had created the 64 for him. He did eventually find more talented young engineers, notably Bil Herd and Dave Haynie. (Commodore always preferred their engineers young and inexperienced because that way they didn't have to pay them much — a strategy that sometimes backfired but was sometimes perversely successful, netting them brilliant, unconventional minds who would have been overlooked by other companies.) When Herd arrived at Commodore in early 1983, engineers had been tinkering for some time with a new video and audio chip, the TED (short for Text Editing Device). With engineering straitened as ever by Tramiel's aversion to spending money, the 23-year-old Herd soon found himself leading a project to make the TED the heart of a new computer, despite the fact that it was in some ways a step back, lacking the sprites of the 64's VIC-II chip and the marvelous sound capabilities of its SID chip. Marketing came up with the dubious idea of including applications in ROM, which by all accounts delighted Tramiel.

Tramiel, who at some fundamental level still thought of the computers he now sold like the calculators he once had, failed to grasp that the whole value of a computer is the ability to do lots of different things with it, to have lots and lots of options its designers may never have anticipated, all through the magic of software. Locking applications into ROM, making them impossible to replace or update, was kind of missing the point of building a computer in the first place. Failing to understand that a computer is only as good to consumers as the quality and variety of its available software, Tramiel also saw no problem with making the new machine incompatible with the 64. It seems to have come as a complete surprise to him when the machine was announced at that fateful Winter CES and everyone’s first question was whether they could use it to run the Commodore 64 software they already had.

After Tramiel's abrupt departure, Commodore pushed ahead with the 16 and Plus/4 in the muddled way that would be their wont for the rest of the company's life, despite a skeptical press and utterly indifferent consumers. It all made so little sense that some have darkly hinted of a conspiracy hatched by Tramiel amongst his remaining loyalists at Commodore to get the company to waste resources, time, and credibility on these obvious losers. (Tramiel recruited a substantial number of said loyalists to join him after he purchased Atari and got back in the home-computer game — exactly the sort of thing for which he so often sued others. But that's a story for a later article.) Incredibly, given the cobbled-together nature of the machine, it took nine more months after that CES to finally get the 16 and Plus/4 into production and watch them duly flop. Again, such a glacial pace would prove to be a consistent trait of the post-Tramiel Commodore.

By the time they did appear at last, the poor, benighted 16 and Plus/4 had more working against them than just their own failings, considerable as those may have been. The year as a whole was marked by failures in the home-computer segment of the market. Atari was reeling. Coleco was taking massive losses on their tardy entry into the home-computing field, the Adam. And of course I’ve already told you about the IBM PCjr.

Even Apple, who had enjoyed a splashy, successful launch of their new higher-end Macintosh (another story for a later date), had a somewhat disappointing new model amongst their bread-and-butter Apple II line. The “c” in the Apple IIc's name stood for “compact,” and it was indeed a much smaller version of Steve Wozniak's old evergreen design. Like the Macintosh, it was a closed system designed for the end user who just wanted to get work (or play) done, not for the hackers who had adored the earlier editions of the II with their big cases and heaps of inviting expansion slots. The idea was that you would get everything you, the ordinary user, really needed built right in: all of the fundamental interface cards, a disk drive, a full 128 K of memory (as much as the Macintosh), etc. All you would really need to add to have a nice home-office setup was a monitor and a printer.

The Apple IIc

But the IIc was not envisioned just as a more practical machine: as the only II model after the first in whose development Steve Jobs played an important role, it evinced all of his famous obsession with design. Indeed, much of the external look and sensibility that we associate with Apple today begins as much here as with the just slightly older — and, truth be told, just slightly clunkier-looking — first Macintosh model. The Apple IIc was the first product of what would turn into a longstanding partnership with the German firm Frog Design. It marks the debut of what Apple referred to as the “Snow White” design language — slim, modern, sleek, and, yes, white. Everything about the IIc, including the packaging and the glossy manuals inside, oozed the same chic elegance.

Apple introduced the IIc at a lavish party and exhibition, dubbed “Apple II Forever,” in San Francisco's Moscone Center in April of 1984, just three months after a similar shindig to launch the Macintosh. The name was chosen to mollify restless Apple II owners who feared — rightly so, as it would turn out; even at “Apple II Forever” Jobs made time for a presentation on “The First 100 Days of Macintosh” — that Sculley, Jobs, and their associates had little further interest in them. Geniuses that they have always been for burnishing their own myths, Apple built a museum right there in the conference center, its centerpiece a replica of the garage where it had all begun. The IIc unveiling itself was an audiovisual extravaganza featuring three huge projection screens for the music video Apple had commissioned for the occasion. The most dramatic and theatrical moment came when Sculley held the tiny machine above him onstage for the first time. As the crowd strained to see, he asked if they'd like a closer look. Then the house lights suddenly came up and every fifth person in the audience stood up with an Apple IIc of her own to show and pass around.

Apple confidently predicted that they would soon be selling 100,000 IIcs every month on the strength of the launch buzz and a $15 million advertising campaign. In actuality the machine averaged just 100,000 sales per year over its four years in Apple's product catalogs. The old, ugly IIe outsold its fairer sibling handily. This left Apple in a huge bind for a while, for they had all but stopped production of the IIe in anticipation of the IIc's success while wildly overproducing IIcs for a rush that never materialized. Thus for some time stores were glutted with the IIcs that consumers didn't want, while the IIes that they did want were nowhere to be found. (It's interesting to consider that the PCjr, which has been tarred with the label of outright flop as the IIc never was, almost certainly sold more units than the IIc during each machine's first year on the market. Narratives can be funny things.)

It remains even today somewhat unclear why the world never embraced the IIc as it had the three Apple II models that preceded it. There’s some evidence to suggest that consumers, not yet conditioned to expect each new generation of computing technology to be both smaller and more powerful than the previous, took the IIc’s small size to be a sign that it was not as serious or powerful as the IIe. Apple was actually aware of this danger before the IIc debuted. Thus the advertising campaign worked hard to explain that the IIc was more powerful than its size would imply, with the tagline, “Announcing a technological breakthrough of incredible proportions.” Yet it’s doubtful whether this message really got through. In addition, the IIc was, like the PCjr, an expensive proposition for the home-computer buyer: almost $1300, $300 more than a basic IIe. For that price you got twice the memory of the IIe as well as various other IIe add-on options built right in, but the value of all this may have been difficult for the novice buyer, the IIc’s main target, to grasp. She may just have seen that she was being asked to pay more for a smaller and thus presumably less capable machine, and gone with the bigger, more serious-looking IIe (if anything from Apple).

Then again, maybe the IIc was just born under a bad sign. As I’ve already noted, nobody was having much luck with their new home computers in 1984, almost regardless of their individual strengths and weaknesses.

But why was this trend so universal? That's what people inside the industry and computer evangelists outside it were asking themselves with increasing urgency as the year wore on. As 1984 drew toward a close, the inertia began to affect even the most established warhorses, the Commodore 64 and the Apple IIe. Both Commodore and Apple posted disappointing Christmas numbers, down at least 20% from the year before, and poor Commodore, now effectively a one-product company completely reliant on continuing sales of the 64, sank back well below that magic billion-dollar threshold again. In the grand scheme of things the Commodore 64 was still a ridiculously successful machine, by far the bestselling computer in the world and the preeminent gaming platform of its era. Yet there increasingly seemed to be something wrong with the home-computer revolution as a whole.

Commodore 64 startup screen

The fact was that a backlash had been steadily building almost from the moment that the spectacular Christmas 1983 buying season had ended. Consumers had begun to say, and not without considerable justification, that home computers promised far more than they delivered. Watching all those bright, happy faces in television and print advertising, people had bought computers expecting them to do the things that the computers there were doing. As Commodore's advertising put it, “If you're not pleased with what's on your TV set tonight, simply turn on your Commodore 64.” Yet what did you get when you turned on your 64 — after you figured out how to connect it to your TV in the first place, that is? No bright fun, just something about 38,911 somethings, a READY prompt, and a cryptically blinking cursor. Everything about television was easy; everything about computers was hard. Computers had been sold to consumers like any other piece of consumer electronics, but they were not like any other piece of consumer electronics. For the vast majority of people — those who had no intrinsic fascination with the technology itself, who merely wanted to do the sorts of things those families on TV were doing — they were stubborn, frustrating, well-nigh intractable things. Ordinary consumers were dutifully buying computers, but computers were at some fundamental level not yet ready for ordinary consumers.

The computer industry was still unable to really answer the question which had dogged and thwarted it ever since Radio Shack had run the first ads showing a happy housewife sorting her recipes on a TRS-80 perched on the kitchen table: why do I, the ordinary man or woman with children to feed and a job to keep, need one? Commodore had cemented the industry’s go-to rhetoric with the help of William Shatner in their VIC-20 advertising campaign that first carved out a real market segment for home computers. You needed a computer for productivity tasks and for your children’s future, “Johnny can’t read BASIC” having replaced “Johnny can’t read” as the marker of a neglectful parent. Entertainment was relegated to an asterisk at the end: “Plays great games too!”

Yet, honestly, how productive could you really be with even the Commodore 64, much less the 5 K VIC-20? Some people did manage to do productive things with their 64s, but most of those who did forgot or decided not to ask themselves a simple question: is doing this on the computer really easier than the alternative? The answer was almost always no. Hobbyists chose to do things on the computer because it was cool, not because it was practical. Never mind if it took far more effort to keep one's address book on the Commodore 64, what with its slow disk drive and quirky, unrefined software, than it would have to just have a paper card file. Never mind if it was much riskier as well, prone to deletion by an errant keystroke or a misbehaving disk drive. It was cooler, and that was all that mattered — to a technology buff. Most other people found it easier to address their Christmas cards by hand than to try to feed envelopes through a tractor-fed dot-matrix printer that made enough noise to wake the neighbors.

Perhaps the one possible compelling productive use of a machine like the Commodore 64 in the home was as a word processor. Kids today can't imagine how, back in the era of typewriters, students once despaired when their teachers told them that a report had to be typed; can't conceive how difficult it was to get anything on paper in typewritten form when every mistake made by untutored fingers meant trying to decide between pulling out the Liquid Paper or just starting all over again. But even word processing on the 64 was made so painful by the 40-column screen and manifold other compromises that there was room to debate whether the cure was worse than the disease. Specialized hardware-based word processors became hugely popular during this era for just this reason. These single-function, all-in-one devices were much more pleasant to use than a Commodore 64 equipped with a $30 program, and cheaper than buying a whole computer system, especially if you went with a higher-priced and thus more productively useful model like the Apple II.

The idea that every child in America needed to learn to program, lest she be left behind to flip burgers while her friends had brilliant careers, was also absurd on the face of it. It was akin to declaring during the days of the Model T that every citizen needed to learn to strip down and rebuild one of these newfangled automobiles. Basic computer literacy was important (and remains so today); BASIC literacy was not. What a child really needed to know could largely be taught in school. Parents needn’t have fretted if Junior preferred reading, listening to music, playing sports, or practicing origami to learning the vagaries of PEEKs and POKEs in BASIC 2.0. There would be time enough for computing when computing and Junior had both grown up a bit.

So, everything had changed yet nothing had changed since the halcyon days of the trinity of 1977. Computers were transforming the face and in some cases the very nature of business, yet there remained just two compelling reasons to have one in the home: 1) for the sheer joy of hacking or 2) for playing games. Lots more computers were now being used for the latter than the former, thanks to the vastly more and vastly better games that were now available. But for many folks games just weren't a compelling enough reason to own one. The Puritan ethic that makes people feel guilty about their pleasures was as strong in America then as it remains today. It certainly didn't help that the media had been filled for several years now with hand-wringing about the effect videogames were having on the psyches of youngsters. (This prompted many computer publishers of this period to work hard, albeit likely with limited success, to label their computer games as something different, something more cerebral and rewarding and even, dare we say it, educational than their simplistic videogame cousins.)

But, perhaps most of all, computers still remained quite expensive when you really dug into everything you needed for a workable system. Yes, you could get a Commodore 64 for less than $200 by the Christmas of 1983. But then you needed a disk drive ($220) if you wanted to do, well, much of anything with it; a monitor ($220) if you wanted a nice picture and didn't want to tie up the family television all the time; a printer ($290) for word processing, if you wanted to take that fraught plunge; a modem ($60) to go online. It didn't take long until you were approaching four digits, and that's without even entering into a discussion of software. There was thus a certain note of false advertising in that sub-$200 Commodore 64. And because these machines were being sold through mass merchandisers rather than dealers, there was no one who really knew better, who could help buyers to put a proper system together at the point of sale. Consumers, conditioned by pretty much everything else that was sold to them to expect a product that worked on its own, were often baffled and frustrated when they realized that the 64 by itself was pretty much useless; they had bought an expensive doorstop. Many of the computers sold during that Christmas of 1983 were turned on a few times only, then consigned to the back of the closet or attic to gather dust. The bad taste they put in many people's mouths would take years to go away. Meanwhile the more complete, useful machines, like the Apple IIc and the PCjr, were still more expensive than a complete Commodore 64 system — and the games on them weren't as good to boot. Hackers and passionate gamers (or, perhaps more commonly, their generous parents) were willing to pay the price. Curious novices largely were not. Faced with no really good all-purpose options, many — most, actually — soon decided home computers just weren't worth it. The real home-computer revolution, as it turned out, was still almost ten years away.
About 15% of American homes had computers — at least ostensibly; many of them were, as just mentioned, buried in closets — by January 1, 1985, but that figure would rise with agonizing slowness for the rest of the decade. People could still live perfectly happy lives fully plugged into the cultural discourse around them and raise healthy, productive children in the process without owning a computer. Only much later, with the arrival of the World Wide Web and computers equipped with more intuitive graphical user interfaces for accessing it, would that change.

Which is not to say that the software and information industries that had exploded in and around the home-computer revolution during 1982 and 1983 died just like that. Many of their prominent members, however, did, as the financial gambles they had taken in anticipation of the home-computer revolution came back to haunt them. We've just seen how Sierra nearly went under during this period. Muse Software and Scott Adams's Adventure International, to name two other old friends from this blog, weren't so lucky; both folded in 1985. Electronic Arts survived, but steered their rhetoric and catalog somewhat away from Trip Hawkins's original vision of “consumer software,” tilting more toward the hardcore in proven genres like the CRPG and the adventure game.

Magazines were even harder hit. By early 1984 there were more than 300 professionally published computing periodicals of one sort or another, many of them just founded during the boom of the previous year. Well over half of these died during 1984 and 1985. Mixed in with the dead Johnny-come-latelys were some cherished veteran voices, among them pioneers Creative Computing (1974), SoftSide (1978), and Softalk (1980). Softalk's demise, after exactly four years and 48 issues of sometimes superb people-focused journalism, came as a particular blow to the Apple II community; Apple historian Steven Weyhrich names this moment as nothing less than the end of the “golden age” of the Apple II. Those magazines that survived often did so in dramatically shrunken form. Compute!, for instance, went from 392 pages in December of 1983 to 160 ten months later.

Yet it wasn't all doom and gloom. Paradoxically, some software publishers still did quite well. Infocom, for example, had the best single year in their history in 1984 in terms of unit sales, selling almost 750,000 games. It seemed that, with more options than ever before, software buyers were becoming much more discerning. Those publishers like Infocom who could offer them fresh, quality products showing a distinctive sensibility could do very well. Those who could not, like Adventure International with their tired old two-word parsers and simplistic engines, suffered the consequences. That real or implied asterisk (“Plays great games too!”) at the end of the advertising copy remained the great guilty secret of the remaining home-computer industry, the real reason computers were in homes at all. Thankfully, the best games were getting ever more complex and compelling; otherwise the industry might have been in even more trouble than it actually was.

Indeed, with a staggering number of machines already out there and heaps still to be sold for years to come, the golden age for Commodore 64 users was just beginning. This year of chaos and uncertainty was the year that the 64 really came into its own as a games machine, as programmers came to understand how to make it sing. Companies who found these keyboard maestros would be able to make millions from them. The home-computer revolution may not have quite panned out as anticipated and the parent company may have looked increasingly clueless, but for gamers the Commodore 64 stood alone with its combination of audiovisual capability, its large and ever growing catalog of games, and its low price. What with game consoles effectively dead in the wake of Atari’s crash and burn, all the action was right here.

In that spirit, we’ll look next time at the strange transformation that led one of our stodgiest old friends from earlier articles to become the hip purveyor of some of the slickest games that would ever grace the 64.

(The indispensable resources on Commodore’s history remain Brian Bagnall’s On the Edge and its revised edition, Commodore: A Company on the Edge. Frank Rose’s West of Eden is the best chronicle I know of this period of Apple’s history. The editorial pages and columnists in Compute! and Compute!’s Gazette provided a great unfolding account of a chaotic year in home computing as it happened. Particular props must go to Fred D’Ignazio for pointing out all of the problems with the standard rhetoric of the home-computer revolution in Compute!’s May 1984 issue — but he does lose points for naming the PCjr as the answer to all these woes in the next issue.)



The Laser Craze

Dragon's Lair

In my last article I described some of the pioneering early work done in multimedia computing with the aid of the new technology of the laser disc. These folks were not the only ones excited by the laser disc’s potential. Plenty of others, at least some of them of a somewhat more, shall we say, mercenary disposition, considered how they could package these advancements into a form practical and inexpensive enough for home users. They dreamed of a new era of interactive video entertainment to supplement or even replace the old linear storytelling of traditional television. Tim Onosko described the dream in an article in the January 1982 issue of Creative Computing that reads like a scene from L.A. Noire:

The scene is your living room. You’re watching a television program — let’s say it’s a cop show. A policeman is questioning a man suspected of committing a crime. The suspect answers in a barely audible tone, and his words come slowly. The policeman finishes his interrogation, then turns to the camera and asks you a question: should we believe him?

On a hand-held remote control, you press a button indicating that you doubt the suspect’s story. The cop consults you again, this time offering three possibilities.

Do you think the suspect was:

A) lying?
B) concealing important facts?
C) in shock and unable to communicate accurately?

The core of this idea was decades old even in 1982. Before she became the voice of Objectivism, Ayn Rand first attracted literary notice for her 1934 play Night of January 16th, a courtroom drama with a twist: members of the audience get to play the jury, deciding whether the defendant is guilty or not guilty. The ending of the play naturally depends on their choice. In 1967 the Czech director Radúz Činčera debuted his film Kinoautomat, in which the audience gets to vote on what happens next — and thus which scene is played next — nine times over the course of a viewing. On a less high-brow note, various cable operators during the 1970s experimented with what might be seen as the ancestors of American Idol, allowing the audience to vote their preferences via telephone. The main thing that distinguishes the scheme that Onosko describes above is that it places the individual, rather than a voting group, in control. In the dawning age of personal computing, that was no small distinction.

Still, interactive-video visionaries faced an uphill climb made even steeper by the very people who championed the laser disc as the next big thing in traditional home video. The legal team at MCA, co-developer of the laser disc, declared that their contracts with the Screen Actors Guild made it illegal to offer any sort of interactive features on their movie discs; they could only sell movies to be “viewed straight through.” Pioneer, the electronics brand with by far the biggest commercial presence in laser discs, didn’t even bother with excuses. They were simply uninterested in the various proposals from interested developers, not even deigning to reply to most. These big companies insisted on seeing the laser disc as a videocassette with better picture and sound, a difficult sale to make in light of the format’s other, very real disadvantages relative to the videocassette. Meanwhile its really revolutionary qualities, while not quite going unnoticed — Pioneer and others did make and sell industrial-grade players and the laser discs they played to the institutional projects I described in my last article as well as many more — were deemed of little ultimate significance in the consumer market.

Denied the industry sanction that might have made of the interactive laser disc a real force in consumer electronics, hobbyists and small developers did the best they could. A tiny company called Aurora Systems developed and sold an interface between an Apple II and the most popular of the early consumer-grade laser-disc players, the Pioneer VP-1000. Even so, those dreaming of a hobbyist-driven market for interactive laser-disc entertainment akin to that of the early general software market would be disappointed. A fundamental problem prevented it: even hobbyists with the equipment and the skill to shoot video sequences for their productions had no way to get that footage onto the read-only medium of the laser disc.

Admittedly, some went to great lengths to try to get around this. In that same January 1982 issue, which was given over almost entirely to the potential of the interactive laser disc and multimedia computing, Creative Computing published a fascinating experiment as a type-in program. Rollercoaster is a text adventure that requires not only an Apple II, a VP-1000, and the Aurora interface but also a laser disc of the 1977 movie of the same name. You, like George Segal in the film, are tasked with trying to stop a madman from blowing up a roller coaster. The game opens by playing an intro sequence from the laser disc interspersed with text. You see the madman planting the bomb, and see the airplane carrying you, a detective, arriving on the scene. Most of the locations you enter once the game proper begins are illustrated by a single frame judiciously chosen from the movie, and various actions are rewarded with a snippet of footage.

Flow chart for the game Rollercoaster, showing where video sequences and still frames appear


Rollercoaster, written by David Lubar with major design contributions from Creative Computing’s publisher and editor David Ahl, is almost certainly the first computer game to incorporate what would come to be called the cut scene. It’s also the first to incorporate video footage from the real world, what would come to be given the tag of “full-motion video,” harbinger of a major if relatively short-lived game-industry craze of the 1990s. Still, its piggybacking on the film of another was legally problematic at best, and obviously inapplicable to a boxed-commercial-software industry. The fundamental blockage — that of lacking the resources to make original laser-disc content — remained. And then along came a Southern Californian named Rick Dyer, who made the deal that would bring interactive video to the masses.

Like so many others, Dyer had found his first encounter with Crowther and Woods’s Adventure a life-changing experience. Even as he built a lucrative career as a videogame programmer for Mattel and Coleco in the late 1970s and early 1980s, he dreamed of doing an adventure game in multimedia. Dissatisfied with the computer graphics available to him, he cast about far and wide for an alternative that would be more aesthetically pleasing. His first attempt was a sort of automated version of the Choose Your Own Adventure books that would soon be huge in children’s publishing. It consisted of a roll of tape upon which was printed text and pictures. As the player made choices using a keyboard, the controlling computer would shuttle the tape back and forth to expose the correct “page” for reading and viewing. Next he created a setup built around a computer-controlled slide projector, with a computer-controlled tape player used to play snippets of audio to accompany each slide. He also tried a complicated VCR setup, in which a videotape was laboriously rewound and fast-forwarded to find the next scenes needed by the game. When laser-disc players began to arrive in numbers, he felt he had the correct format at last. He started a company of his own, Advanced Micro Systems, and went to various toy companies with an idea he called The Fantasy Machine, a sort of interactive storybook for children. He found no takers. But then he met two partners just desperate enough to listen to his ideas.

Beginning in 1977, Cinematronics had developed and marketed quite a number of arcade games. Their games never entered the public consciousness in the way of an Asteroids or Pac-Man, but they did well enough, and are fondly remembered by arcade aficionados today. By 1982, however, the hits had stopped coming and the company’s vector-graphics technology had begun to look increasingly dated. Overextended and poorly managed like so many companies in this young and crazy industry, they ended up in bankruptcy, needing a hit game to convince the court not to liquidate them entirely. They found what looked like their best shot at such a thing in an unlikely place: in Dyer, who proposed adapting his interactive children’s storybook into an arcade experience. With little else on the horizon, they decided to roll the dice on Dyer’s scheme. Inside the arcade machine’s cabinet would be a very simple computer board, built around the tried-and-true Z80 processor, connected to a Pioneer laser-disc player. Dyer had found 5000 industrial-grade models of the latter languishing in a warehouse in Los Angeles, victims of the somewhat underwhelming reception of the laser disc in general. Pioneer was willing to sell them cheap — a critical consideration for a shoestring operation like this one.

With the hardware side in place, Dyer now needed someone to make the video footage his software would control. He had, in other words, come to the crux of the problem with computer-controlled video. Dyer, however, had an advantage: he lived on the doorstep of Hollywood. He was able to find just the person he needed in Don Bluth.

Bluth was a skilled animator who must have felt he had been born at the wrong time. He got his dream job at Disney in 1971, arriving just in time for the era that would go down as the nadir of Disney’s long history in animated film. Walt Disney’s Nine Old Men were now indeed getting old, and Walt himself was gone, leaving the studio without a strong leader. The result was almost two decades of critical and commercial underachievers, the sole exception being The Rescuers (1977), for which the eight remaining Old Men roused themselves to recapture the old spirit one last time. Bluth also got to work on that film, but otherwise his assignments were disheartening. Yet his options outside Disney were also limited at best. In this era the Saturday-morning cartoons were king. Bluth, a classicist by training and temperament, loathed the make’em-quick-and-cheap-and-sell-the-toys ethos of that world. (“There are two kinds of animation: the Bambi and Pinocchio classical style, and the Saturday-morning-cartoon type. I’d rather sell shoes than do the latter.”) When he left Disney at last in 1979 it was to form his own studio, Don Bluth Productions, to make the kind of big, lush animated features that didn’t seem to interest Disney anymore.

Most would say he delivered with 1982’s The Secret of NIMH, the first film that was fully his. But while the critics raved, the public stayed away. Bluth blamed the film’s failure on his distributors MGM/UA, who failed to get it into enough theaters and promoted it only halfheartedly; MGM/UA would probably say that an old-fashioned, animated feature like NIMH was simply passé in the year of E.T., Star Trek II, and Tron. To add to Bluth’s woes, a major strike hit the animation industry just as he was hoping to begin production on a second film. He managed to cut a private deal with the union, but as he did so his financial backers lost faith and pulled their support for the new movie. With no obvious reason to continue to exist, Don Bluth Productions, like Cinematronics, faced bankruptcy and liquidation. And then, like Cinematronics, they got a call from Rick Dyer.

With little money of his own and with his partner literally bankrupt, Dyer couldn’t offer Bluth much beyond a one-third stake in whatever money the doubtful venture might eventually earn. Still, Bluth jumped on the proposal as “a dinghy to a sinking ship.” He scraped together a $300,000 loan, enough to prepare about five minutes of footage for a prototype system that the three partners, who now called themselves Starcom, could show to potential investors. The windfall came when Coleco, a Johnny-come-lately suddenly pushing hard to build a presence in home videogames, offered a cool $1 million for the right to make a home version of the game. Starcom wasn’t quite sure how Coleco was going to manage that, but they were thrilled to take their money. It was enough to let Bluth and company finish the 22 minutes of animated footage found in the final game, albeit barely; with no money to hire voice actors, for instance, the animators and their colleagues around the office simply did the voices themselves. (Editor Dan Molina, who voiced hapless hero Dirk the Daring, seems to have been channeling Scooby-Doo…)

The decision to make The Fantasy Machine into an arcade game necessitated a radical retooling of Dyer’s original vision. Such a slow-paced exercise in interactive storytelling was obviously not going to fit in the frenetic world of the arcade, where owners expected games that were over in a few minutes and ready to accept the next quarter. The Fantasy Machine therefore became Dragon’s Lair, with the story stripped down to the very basics. You guide Dirk the Daring, a courageous but awkward hero in the tradition of Wart from The Sword in the Stone. Dirk loves Daphne, a shapely but empty-headed feminist’s nightmare modeled after old Playboy centerfolds. (Bluth: “Daphne’s elevator didn’t go all the way to the top floor, but she served a purpose.”) Daphne has been kidnapped by the evil wizard Mordroc and his pet dragon — horrid pun coming! — Singe. The game comes down to escaping all of the monsters, traps, and other obstacles in Mordroc’s castle until you arrive in the inner sanctum at last for the final showdown.

Dragon's Lair

Menus asking what to do next were replaced by action sequences that require you to make the right movement with the joystick or hit the fire button to strike with the hero’s sword at the right instant as the video plays; failure means the loss of one of your three lives. At first the team tried to preserve some semblance of you actually guiding the story by placing, say, several doors in a room, each leading to a different scene. In time, however, even that fell away, as all meaningful choices were replaced by what the development team called a “threat/resolve” model. The game as released plays its 30 or so scenes in a randomized order to keep you from getting bored — or too comfortable, thus extending your time at the machine. In each, complete success and complete failure at executing the necessary arbitrary movements in the proper time windows are the only options. You either survive, in which case that scene is checked off your to-do list, or you die, in which case you lose a life, one of the silly death animations which make up a huge chunk of the total content on the laser disc plays, and the scene is shuffled back into the deck.
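That “threat/resolve” scene-deck loop is simple enough to sketch in a few lines of code. The Python below is purely illustrative — every name in it is invented for the purpose, and the real game was of course Z80 machine code driving a laser-disc player, not anything like this:

```python
import random

def play_dragons_lair(scenes, attempt, lives=3):
    """Sketch of the 'threat/resolve' loop described above.

    `scenes` is a list of scene identifiers; `attempt(scene)` returns True
    if the player executes every timed move in that scene correctly.
    (All names here are hypothetical, chosen only to mirror the prose.)
    """
    deck = scenes[:]            # the scenes still to be survived
    random.shuffle(deck)        # played in randomized order
    while deck and lives > 0:
        scene = deck.pop()
        if attempt(scene):      # complete success: check the scene off
            continue
        lives -= 1              # complete failure: lose a life...
        deck.append(scene)      # ...and the scene goes back into the deck
        random.shuffle(deck)
    return lives > 0            # did you survive every scene?
```

Note how the reshuffle-on-failure is what makes the game a pure memorization exercise: the only state that matters is which scenes you have survived and how many lives remain.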

Let’s take a look at one of these scenes in action. The clip below shows one of the longest scenes in the game, running almost a full minute; many others are over in a scant ten seconds or less. After you (unavoidably) fall into the conveniently placed boat, you have to make thirteen movements with the joystick at the right instants. If you flub any one of these, the scene is immediately interrupted for a separate death sequence.

Lengthy as it is, this is actually one of the easier scenes in the game. The flashes in the oncoming tunnels give some clear visual indication of what you need to do, and most of the necessary actions are fairly intuitive. You may only need to die three or four times here to get the sequence straight. Most scenes are not so forgiving. It’s never obvious just when you should be trying to control Dirk and when you should just be watching; nor is it always clear which move is the correct one, or just when it needs to be executed. You can learn only through trial and error. Back in the day, you were paying 50¢ for every three lives whilst doing so; as a technological showcase the game was priced at twice the normal going rate. If we define a good game as one that gives you lots of interesting choices, Dragon’s Lair must be the worst game ever. As John Cawley noted in his book about Don Bluth, it’s more of a maze than a game; the smartest players were those who just watched other people play for hours while noting the correct moves, to be used to hopefully run through the whole thing in one go when they finally felt ready. I like the description at Dave’s Arcade best: “[Dragon's Lair] is a hybrid of an animated movie and a Choose Your Own Adventure book…except the book is ripped from your hands and thrown across the room every time you fail to turn the page fast enough.” And yet, punishing as the game is while you learn the moves, it becomes trivial once you’ve accomplished that; within weeks of its release every arcade in America had that one annoying kid who had mastered it and used his skill to extend his time at the machine while frustrated arcade owners gnashed their teeth. Thus the game manages the neat trick of being too difficult and too easy at the same time.

None of which prevented it from turning into an absolute sensation when it arrived in arcades in the summer of 1983, and not only amongst the usual arcade rats. Thanks in some degree to Don Bluth, whose background as a traditional animator seemed to somehow legitimize Dragon’s Lair in their eyes, the mainstream media and the Hollywood establishment took to it with gusto. It got a feature spot on Entertainment Tonight, feature articles in The Hollywood Reporter and Daily Variety, front-page coverage in a hundred newspapers. Ricky on Silver Spoons got his own personal Dragon’s Lair machine to play on as part of the show’s fantasy of living the good life, teenage style. The New York Post called it “a quantum jump into a whole new art form of the arcade.” Many people who had resisted the lure of the arcades during the days of Space Invaders and Pac-Man now came in at last to have a look and give it a go. Cinematronics couldn’t make enough machines to meet demand. Those arcades that managed to secure one sometimes had to snake velvet ropes around the premises for the line of people waiting to play. Some wired up additional monitors and mounted them high above the machine so the people in line could watch the current player’s exploits. One allegedly installed bleachers for the pure spectators. Most machines earned back their $4000 purchase price within a week or so, while also boosting earnings from all of the other, older machines around them that people played when they got bored of waiting for Dragon’s Lair.

The craze isn’t difficult to understand. Cursory observation — about all the average non-gaming beat reporter was likely to give it — can make it seem that the player is really controlling Dirk, really guiding him through a lushly animated, interactive cartoon. Seen in this light, and when compared to the flickering sprites and electronic bleating of the other machines in the arcade, Dragon’s Lair could seem like an artifact beamed in from twenty years in the future. The audiovisual leap from old to new was so extreme as to be almost unfathomable, making Rick Dyer and Don Bluth look like technical sorcerers with access to secrets denied to the rest of the world. It was as if the movies had leaped from The Jazz Singer to Star Wars in a single year.

Pundits within the industry, meanwhile, had their own strong motivations to see Dragon’s Lair and the “laser-disc revolution” it allegedly heralded in the best possible light. What had begun as a worrisome lack of continued growth in the arcade and home-game-console industries during the second half of 1982 had by that summer of 1983 become a clear, undeniable downturn that was looking more and more like it was about to become a free fall. It appeared that all of those who had snorted dismissively about the videogame “fad” might just have been right. And so, just as Don Bluth saw Dragon’s Lair as the dinghy that could save his company and his career in animation, arcade owners and game makers saw Dragon’s Lair as the dinghy that could save their industry. And for a while that really did seem possible. While the bottom dropped out of the home videogame market, Dragon’s Lair kept the arcades above water. (One arcade owner made a comment about Dragon’s Lair’s popularity that could be read as ominous as easily as ecstatic: “There is no number two. It’s just taken over.”) People in the industry convinced themselves that 1984 would bring a wave of other, even better laser-disc games and the high times would well and truly be here again. John Cook, writer for an industry magazine, gushed that “by this time next year a new videogame without a laser-disc player will be as rare as a silent movie in 1929.”

M.A.C.H. 3 and Bega's Battle, two of the short-lived laser-disc games of 1984


In reality the second half of 1983, when Dragon’s Lair stood alone, was as good as it got for laser-disc games. Cook’s predicted avalanche of new games did hit with the new year, but they were uniformly uninspiring. They fell into two general categories: those that aped Dragon’s Lair‘s “interactive cartoon” approach with all of its associated limitations and those that used laser-disc video strictly as eye candy, displaying it behind and between levels of a more conventional game. In addition to their lack of depth, virtually all of these games also lacked the one saving grace of Dragon’s Lair: the skilled animators at Don Bluth Productions. Some of them did the best they could with the artists they could find; many grabbed their footage from cartoons or even feature films (Astron Belt, a game which actually predates Dragon’s Lair in its original Japanese release, used footage from the recent Star Trek II amongst other sources); all of them looked shabby in comparison to Bluth’s work. None did very well, and the industry as a whole settled back into the decline that Dragon’s Lair had briefly arrested.

Space Ace

Even Dyer and Bluth’s followup to Dragon’s Lair, Space Ace, despite having a more coherent, linear plot progression and giving the player at least a modicum more control over its direction, failed to recapture the old magic. Players of Dragon’s Lair had fallen into two groups: the casually curious, who lost a couple of dollars before they even figured out what was happening on the screen or what they were supposed to be doing and moved on with a shrug; and the committed, who doggedly worked out the moves and battled their way to the end. With the novelty of the cartoon graphics now gone, neither group showed much interest in repeating the experience. As for the arcade industry: it would eventually stabilize and even recover somewhat, but those heady days circa 1981 would never return.

Even at its peak Dragon’s Lair never quite paid off for the folks who made it the way the hype might have made you think it did. Their lack of financial resources and the bankruptcy court that had to approve Cinematronics’s every move kept them from fully capitalizing on the early publicity. They eventually ran out of the surplus, discontinued laser-disc players that Dyer had found, and had a terrible time getting new ones out of Pioneer. Cinematronics did manage to produce over 10,000 units over Dragon’s Lair’s brief production run, a very impressive figure in a slumping arcade industry, but could probably have sold several times that if they could only have made them while the craze lasted. On the other hand, the game’s scarcity doubtless added to its mystique, and allowed Cinematronics to sell each unit for $4000, easily twice the industry’s going rate. Less ambiguous was the damage done by the technical faults that started to crop up after a few months. Dragon’s Lair worked its laser-disc player hard, sending the laser careening all over the disc for ten or twelve hours per day of constant use. Meanwhile the machine that housed it was getting constantly kicked, slapped, and jostled by angry or jubilant players (more of the former, one suspects, given the nature of the game). Pioneer had never planned for such conditions. The players started to fail in relatively short order, leaving Cinematronics scrambling to replace them, at considerable expense in money and in the precious new laser-disc players they had to use as replacements, for angry arcade owners who had just lost their cash cow.

The partners, like the industry as a whole, mistook player infatuation for commitment. Space Ace, which cost twice as much as Dragon’s Lair to make, did a bare fraction of the business. Development of a third game, Dragon’s Lair II, was halted in March of 1984. It was hoped that this would just be a temporary delay, to let the laser-disc scene shake itself out a bit and the substandard Dragon’s Lair knock-offs fade away. But by July Cinematronics couldn’t sell the Dragon’s Lair and Space Ace games that were now clogging their warehouse. Production had finally ramped up just in time for demand to cease. The world had moved on; Dragon’s Lair II was cancelled. The people who had planned to make it had no choice but to move on as well, although not without accusations and threats amongst the partners as everyone blamed everyone else for what had happened.

Cinematronics straggled on in the diminished arcade industry for more years than anyone might have expected before finally being acquired by another arcade survivor: WMS Industries, the company that had once been Williams Electronics of Defender fame.

Rick Dyer renamed his company RDI Video Systems to continue to pursue his original dream of The Fantasy Machine. He put together a laser-disc entertainment system for the home called Halcyon, or just Hal for short, a deliberate play on the computer HAL from 2001: A Space Odyssey; apparently he judged that people thinking of inviting Hal into their living rooms wouldn’t think too much about HAL’s running rather messily amok in the film.

Halcyon Game System

Hal talked to you, and, if he was in a good mood, accepted a limited number of voice commands back in return. This feature was enough evidence for RDI to declare that he was “artificially intelligent,” again without seeming to think about where HAL’s AI got the poor Discovery crew in the movie. Dyer and one of his partners appeared with Hal on Computer Chronicles, giving what has to be one of the most uncomfortable product demonstrations ever. Hal refuses to understand host Stewart Cheifet when he says the simple word “one,” to the point that Cheifet finally just gives up and takes option two instead. Meanwhile co-host Gary Kildall, no slouch in matters of computer science, presses Dyer and his associate relentlessly to abandon their patently silly AI claim; they just cling to it all the tighter.

Dyer hoped to release a whole line of interactive laser discs for Halcyon, but only three were ever completed: a couple of football games that use real NFL footage, and Thayer’s Quest, a menu-driven interactive story that hews very close to Dyer’s original plans for the game that became Dragon’s Lair.

Thayer's Quest

Halcyon as a whole is an amazing, bizarre, visionary, kooky creation years ahead of its practical time. As the coup de grâce, RDI planned to sell it for a staggering $2200. It’s unclear whether any were actually sold on the open market before Dyer’s investors pulled the plug; if so, the numbers were truly minuscule. After Halcyon’s failure Dyer continued intermittently to work with interactive narratives, surfacing again in the mid-1990s with two adventure games, Kingdom: The Far Reaches and Kingdom II: Shadoan.

Don Bluth never had any real passion for videogames; it’s unfortunate that Dragon’s Lair has gone down in history as a Don Bluth creation, when in reality it was very much Rick Dyer’s vision. Even at the height of the game’s success Bluth always talked about it as a means to an end, a way to expose the arcade generation to the pleasures of classical animation rather than as a new type of entertainment in its own right. Short-lived as its success was, Dragon’s Lair served its purpose for Bluth. It did indeed become the dinghy that kept him afloat in the world of commercial animation until the opportunity to do another feature came along. Bluth found a backer in Steven Spielberg, whose Amblin Entertainment funded and released Bluth’s An American Tail in 1986. That film, along with the likes of The Brave Little Toaster and Who Framed Roger Rabbit, marked the beginning of a renaissance for animation on the big screen, paving the way for Pixar and a rejuvenated Disney to return the big-budget animated feature to the yearly blockbuster rolls in the 1990s. But another, more direct legacy of Dragon’s Lair probably didn’t thrill Bluth quite as much: the game was adapted into exactly the kind of knock-off Saturday-morning cartoon he loathed. Unfortunately for ABC, it debuted only in the fall of 1984, by which time the kids they were trying to reach had moved on long ago. It lasted for only one season of 13 episodes.

Coleco also saw little return for their investment in the Dragon’s Lair intellectual property. They had schemed to introduce a laser-disc player for their ColecoVision console and/or their ill-fated Adam home computer, but soon realized — shades of the $2200 Halcyon — it would just be too expensive to be practical. Instead they funded a completely new game for the Adam inspired by scenes from the original. It didn’t look as nice, but was probably more fun in the long run. That game turned out to be just the first — and arguably one of the best — of a long, confusing stream of games that have carried the Dragon’s Lair name since. When Readysoft released a version for the Amiga in late 1988 it was rightly seen as a landmark. As the first version that looked reasonably close to the laser-disc original, it marked just how far computer graphics had come in five years; soon we would be in the era of Pixar, when computers would be used to create entire feature-length cartoons. But not all things change — the gameplay remained as simplistic as ever. Today Digital Leisure sells Dragon’s Lair, Space Ace, and Dragon’s Lair II, completed at last, in versions playable on anything from your Blu-Ray player to your iPhone. And yes, it’s still the same exercise in rote memorization it’s always been, with a few optional kindnesses to make the experience a bit less painful. Dragon’s Lair must be the most long-lived bad game in the history of the industry. Such is the power of nostalgia.

Dragon’s Lair makes an interesting study today not just as an historical curiosity or an example of style over substance, although it is both of those things. In addition to being one more crazy, unexpected offshoot of the original Adventure, that urtext of an industry, it’s an important early way station in gaming’s long relationship with movies; indeed, I believe it’s the first game to give itself the fraught title of “interactive movie.” The lesson that may seem obvious after playing Dragon’s Lair a few times is one that the industry would learn only slowly and painfully: non-interactive video is a problematic fit with an interactive medium, a subject we’ll undoubtedly explore in depth around here if we ever make it to the era of the lost and lamented (?) full-motion video games of the 1990s.

But for now let’s not judge Dragon’s Lair too harshly. It may not be much of a game, but, like so much of what I write about on this blog, it’s a great example of stretching available technology just as far as it will go and creating something kind of amazing in its time and place. For that golden six months in 1983, at least, that was more than enough. The impression it made on hearts and minds in that short span of time has fueled thirty years of nostalgia. Not bad for a 22-minute cartoon.

(As mentioned in the article, you can still buy various incarnations of Dragon’s Lair and associated games from Digital Leisure. You can also use some of these products as a key to let you play the games in their original form using the Daphne emulator. See that project’s website for more information.

Primary sources used for this article included articles in the January 1982 Creative Computing, the November 1983, January 1984, and January 1985 Electronic Games, and the April 1984 Enter. Online sources included The Dot Eaters and The Dragon’s Lair Project. Finally, John Cawley’s book The Animated Films of Don Bluth was indispensable.)

