Like its American counterpart, the British PC industry was untenably fragmented by the beginning of 1983. The previous year had been deemed Information Technology Year by the government. Unlike so many government initiatives, this one had succeeded swimmingly in its goal of drumming up excitement and enthusiasm amongst the public for microcomputers. Where excitement and enthusiasm go in a market economy, of course, also go products. Thus the new computers had come thick and fast throughout 1982. In addition to the BBC Micro and the Sinclair Spectrum which I’ve already written about, there were heaps of other machines whose names sound like something spewed by a beta version of Google Translate: the Dragon 32, the Grundy NewBrain, the Jupiter Ace, the Camputers Lynx, the Oric-1. Throw in a spate of knockoffs and clones from the Far East, and the situation was truly chaotic; most of these machines were incompatible with one another. Something had to give. If the lines of battle had been drawn up in 1982, the war would begin in earnest in 1983, just as it had in North America.
Even if you aren’t that familiar with British computing history, you probably aren’t exactly in suspense about who won in the British theater of the Home Computer Wars of 1983. The fact that I chose to feature the BBC Micro and the Sinclair Spectrum on this blog in preference to all those other oddball models pretty much says it all. The BBC Micro found a home in virtually every school in Britain, and, even at a street price of £400 or so, also became a favorite of researchers and hardcore hobbyists, who loved its sturdy construction and expandability. The Spectrum had none of these things going for it, but it did have a £130 price tag and a bright color graphics display for games. Neither machine was perfect, but each was a good fit for its niche. And in addition to hardware specifications both had considerable soft power working in their favor. The BBC Micro had been blessed with the imprimatur of the British government, and thus stood in effect as the official computer of the British nation. And the Speccy came from Uncle Clive, the man who had first brought low-cost computing to the British masses. Sure, he was a bit eccentric and a bit prickly, but that was just a manifestation of his impatience with the bureaucrats and business concerns that delayed his inventions reaching the masses. It was an image that Sinclair, who had begun to read his own positive press notices when said notices existed only in his head, positively reveled in. Throughout the year Sinclair struggled to keep up with demand, as seemingly every kid in Britain begged their parents for a Speccy. Meanwhile those other computers straggled on as best they could before bowing to the inevitable one by one during this year and the next. Kids wanted what their friends had, and their friends all had Speccys.
Put crudely, then, the BBC Micro came to occupy the space in British computing held by the Apple II in North America, the Establishment choice for education and home computing. The Speccy, meanwhile, was the Commodore 64, the cheaper, ruder, funner model that the kids adored. Just to keep us from getting too neat with our analogies, it should be noted that the Commodore 64 itself also began arriving in numbers in Britain during 1983. However, the vagaries of economics and exchange rates being what they were, its initial price there was closer to that of the BBC Micro than the Spectrum, limiting its sales. The Commodore 64 became the computer for the posh public-school kids, while the Speccy remained the choice of the masses. The former was unquestionably a much more capable machine than the latter by any objective measure, but even in later years, when the price dropped and the 64’s popularity grew, it never quite got the same sort of love that accrued to the Spectrum. Like driving on the wrong side of the road and eating baked beans for breakfast, there was just something indelibly British about the Speccy’s peculiar BASIC and weird keyboard, something that made a generation of British gamers and game programmers fall in love with it as their machine.
To work this comparison one last time, Clive Sinclair’s 1983 in Britain was like Jack Tramiel’s in North America — the best, most unblemished, most triumphant year of a long, chequered career. It must have felt like vindication itself when he received an invitation to attend the Queen’s Birthday Honors in June to receive a knighthood. Suddenly Uncle Clive had become Sir Clive. Given Sinclair’s relationship with the British bureaucracy the honor might have seemed a surprising one. Indeed, at the time that he received it his company was still embroiled in various government investigations for its failure to ship its products in a timely fashion to customers as well as a rash of complaints about shoddy workmanship. (Sinclair was still desperately trying to recall some 28,000 Spectrum power packs that had the potential to shock a person into unconsciousness — shades of the exploding watches of yore.) Luckily, he was a huge favorite of Margaret Thatcher, no friend of entrenched bureaucratic forces herself, who saw him as exactly the kind of entrepreneur that her new, more freedom-loving and capitalism-friendly Britain needed. And Thatcher, who was riding a tide of personal popularity and renewed patriotism of her own in the wake of the Falklands War, generally got what she wanted. The press gushed with praise in the wake of Sinclair’s honor, some justified, some somewhat, shall we say, overblown. Popular Computing Weekly credited him with “transforming Britain from a nation of shopkeepers to a nation of micro users.” Sinclair User simply announced that he had “invented the home micro,” conveniently forgetting about some folks on the other side of the Atlantic.
Clive still being Clive regardless of his honorific, he sank his cash and his reputation into projects that were of debatable wisdom at best. In lieu of a floppy-disk drive for the Spectrum, he invested in a strange piece of technology called the Microdrive, a tape-based system that looked and operated rather like an old 8-track audio tape. Announced simultaneously with the Spectrum itself back in April of 1982, the Microdrive didn’t finally arrive until the summer of 1983. When it did it was like a caricature of a Sinclair product: cheaper than the competition but also slow and balky and horribly unreliable. A computer crash at the wrong moment could erase an entire tape in seconds. Users may have partially embraced the Speccy because of its eccentricities, but this was taking things too far. Rather than being charming the Microdrive was just sort of terrifying. It never got much love from Speccy users, who chose to stick with the even slower but more trustworthy medium of the cassette tape. In his choice to develop such a white elephant rather than invest in the plebeian, well-proven technology of the floppy disk we see the most exasperating side of Clive Sinclair, who was always trying to prove how much more clever he was than the conventional wisdom of his competitors, even though conventional wisdom is often conventional for a reason. The Microdrive in turn shows the dangers of a company that is absolutely controlled by a single mercurial individualist. Sinclair’s backers and fans would learn much more about that in the time to come.
Then again, at least the Microdrive was a computer product. Sir Clive, who always harbored a deep skepticism about how long this computer thing was really going to last, also sank energy and resources into his twin white whales, a miniature, portable television set and an electric car. Both projects would provoke much hilarity in the British press in later years when his star as a captain of industry had faded. But instead of going on any more about any of that today let’s just leave Sir Clive to enjoy his big year. His road will get bumpier soon enough.
Criticisms aside, Sinclair did play a huge role in turning Britain into the most computer-mad nation on Earth. Despite the American industry’s considerable head start, a greater percentage of British than American homes had computers by the end of 1983. Already by April total British microcomputer sales had passed the one-million mark. By December the Speccy alone was flirting with that figure.
All those computers in private hands meant a software marketplace that was if anything growing even faster than the hardware side. And since the computers selling in biggest numbers were the Speccys being installed in bedrooms and living rooms across Britain, software in this context meant mostly games. By 1983 a hit game could make you, at least for the time being, rich, as was demonstrated by a flood of brash young game publishers populated by brash young men just a year or two (at most) removed from bicycling to school. Now they drove Porsches and Ferraris to posh offices in the most fashionable parts of town. A company called Imagine Software, publishers of such Speccy hits as Arcadia, was amongst the most spectacular of the success stories. When a BBC film crew visited their office for a documentary feature in early 1984 they found “huge, luxurious offices, acres of carpet, computer terminals by the ton load, lots of young programmers, secretaries in abundance, young ‘gophers’ acting as runners for the management, and a company garage packed with a fleet of Ferrari Boxers, BMWs for the lesser executives, and the famous Mark Butler custom hand-built Harris motorbike.” Clearly a certain sector of British society had another very good reason to love Sir Clive: his creation was making them rich.
Just as in America, established media forces were also eager to get a piece of the action. Virgin Records launched Virgin Games, and of all people K-tel, those purveyors of cheesy TV-peddled hits compilations, also jumped in, attending the Midland Computer Fair with a well-publicized £1 million burning a hole in their pockets for deal-making with eager programmers.
Yet even with the arrival of Big Money on the scene the British industry remained wilder, woolier, and more democratic than its American counterpart. The games themselves remained much less expensive. Whereas a big release like Ultima III could exceed $50 in America, games in Britain virtually never exceeded £10, and most sold for around £5 or even less. With less invested in any particular title, both publishers and buyers were more willing to take chances on crazy ideas, and individual programmers had a better chance of seeing their creations on store shelves and actually making them some money. Even if no one wanted to give them a chance they could just start their own little company in the hope of becoming the next Imagine; distribution was also comparatively wide open, making it relatively easy to get your game to the public on your own. It all added up to a market that had a lot of product that for very good reasons would never have passed muster in the United States. Yet it also had a spirit of wild-eyed, devil-may-care creativity about it that was sometimes lacking amongst the more staid, polished American publishers.
My special interest, adventure games, was a big part of the industry, amongst the most popular genres out there if not the most popular of all. As with other kinds of games, adventures seemed to be multiplying exponentially from month to month. Britain was not just computer mad but also adventure mad. Well before the end of the year production of new British adventure games far outstripped that of their American counterparts, and the disparity would only continue to grow over the next few years. In November Micro-Adventurer debuted, the first magazine anywhere in the world dedicated not just to games in general but to this particular genre.
To survey this explosion of titles in any real depth would bog us down for months; that will have to remain a task for some other, even more esoteric blog than this one. But I will try to convey some sense of the times by continuing to follow the careers of some friends we met earlier. We’ll do that next time.
(This survey of the scene is drawn mainly from the Your Computer and Sinclair User issues of 1983, with occasional forays into Home Computing Weekly and Popular Computing Weekly. The image is taken from the 1984 Sinclair User annual’s cover.)
Just before we left the U.S. for Denmark almost four years ago, I bought a cache of old Amiga magazines on eBay to help with the book I was planning to write there. By the time we moved to Norway almost two years ago, the book was finished, and so I deposited the magazines at my in-laws’ house near Flensburg, Germany. (There are a lot of different countries in my life these days.) Now said in-laws are hoping to move soon, so I need to do something with them. I just don’t have the space to keep them, especially as we’re likely to be moving yet again quite soon. Nor do I need them anymore in hard-copy form, because they’re all now archived on my computer. So, I’m wondering if anyone wants them.
What they are, specifically, is almost every issue of AmigaWorld and Amazing Computing from the first issue until just shortly after Commodore’s 1994 bankruptcy, in (relatively) gently used condition. I say “almost every” for the sake of caution, as there may be just one or two issues of either or both magazines missing, but no more than that. There are also some sales catalogs, a few issues of an early desktop-video magazine, and some other loose bits.
So, I’m willing to give the whole collection to anyone who can arrange for their transportation. That, alas, could be the sticking point. They fill at least eight or ten boxes (maybe more), and they aren’t light. Shipping would likely be expensive within Europe, very expensive internationally. The ideal scenario would be someone willing to pick them up in Flensburg. I’ll be there this Friday and Saturday only, but I could arrange for one of my in-laws to be there most days in the near future.
If you want them, send me an email using the link at the right and tell me how you propose to get them. Should there be a rush I’ll decide based on some combination of first come, first served and practicality. And if you know anyone else who might be interested, please tell them about it. If I can’t find them a home I’ll probably have to take the magazines to the recycler, much as that would pain me. Hundreds of magazines just aren’t compatible with the Traveling Scandinavian Roadshow that is our lives at the moment.
In my last article I described some of the pioneering early work done in multimedia computing with the aid of the new technology of the laser disc. These folks were not the only ones excited by the laser disc’s potential. Plenty of others, at least some of them of a somewhat more, shall we say, mercenary disposition, considered how they could package these advancements into a form practical and inexpensive enough for home users. They dreamed of a new era of interactive video entertainment to supplement or even replace the old linear storytelling of traditional television. Tim Onosko described the dream in an article in the January 1982 issue of Creative Computing that reads like a scene from L.A. Noire:
The scene is your living room. You’re watching a television program — let’s say it’s a cop show. A policeman is questioning a man suspected of committing a crime. The suspect answers in a barely audible tone, and his words come slowly. The policeman finishes his interrogation, then turns to the camera and asks you a question: should we believe him?
On a hand-held remote control, you press a button indicating that you doubt the suspect’s story. The cop consults you again, this time offering three possibilities.
Do you think the suspect was:
A) lying?
B) concealing important facts?
C) in shock and unable to communicate accurately?
The core of this idea was decades old even in 1982. Before she became the voice of Objectivism, Ayn Rand first attracted literary notice for her 1934 play Night of January 16th, a courtroom drama with a twist: members of the audience get to play the jury, deciding whether the defendant is guilty or not guilty. The ending of the play naturally depends on their choice. In 1967 the Czech director Radúz Činčera debuted his film Kinoautomat, in which the audience gets to vote on what happens next — and thus which scene is played next — nine times over the course of a viewing. On a less high-brow note, various cable operators during the 1970s experimented with what might be seen as the ancestors of American Idol, allowing the audience to vote their preferences via telephone. The main thing that distinguishes the scheme that Onosko describes above is that it places the individual, rather than a voting group, in control. In the dawning age of personal computing, that was no small distinction.
Still, interactive-video visionaries faced an uphill climb made even steeper by the very people who championed the laser disc as the next big thing in traditional home video. The legal team at MCA, co-developer of the laser disc, declared that their contracts with the Screen Actors Guild made it illegal to offer any sort of interactive features on their movie discs; they could only sell movies to be “viewed straight through.” Pioneer, the electronics brand with by far the biggest commercial presence in laser discs, didn’t even bother with excuses. They were simply uninterested in the various proposals from interested developers, not even deigning to reply to most. These big companies insisted on seeing the laser disc as a videocassette with better picture and sound, a difficult sale to make in light of the format’s other, very real disadvantages relative to the videocassette. Meanwhile its really revolutionary qualities, while not quite going unnoticed — Pioneer and others did make and sell industrial-grade players and the laser discs they played to the institutional projects I described in my last article as well as many more — were deemed of little ultimate significance in the consumer market.
Denied the industry sanction that might have made of the interactive laser disc a real force in consumer electronics, hobbyists and small developers did the best they could. A tiny company called Aurora Systems developed and sold an interface between an Apple II and the most popular of the early consumer-grade laser-disc players, the Pioneer VP-1000. Even so, those dreaming of a hobbyist-driven market for interactive laser-disc entertainment akin to that of the early general software market would be disappointed. A fundamental problem prevented it: even hobbyists with the equipment and the skill to shoot video sequences for their productions had no way to get them onto the read-only medium of the laser disc.
Admittedly, some went to great lengths to try to get around this. In that same January 1982 issue, which was given over almost entirely to the potential of the interactive laser disc and multimedia computing, Creative Computing published a fascinating experiment as a type-in program. Rollercoaster is a text adventure that requires not only an Apple II, a VP-1000, and the Aurora interface but also a laser disc of the 1977 movie of the same name. You, like George Segal in the film, are tasked with trying to stop a madman from blowing up a roller coaster. The game opens by playing an intro sequence from the laser disc interspersed with text. You see the madman planting the bomb, and see the airplane carrying you, a detective, arriving on the scene. Most of the locations you enter once the game proper begins are illustrated by a single frame judiciously chosen from the movie, and various actions are rewarded with a snippet of footage.
Flow chart for the game Rollercoaster, showing where video sequences and still frames appear
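To make the mechanics a little more concrete, here is a minimal sketch of the general pattern a program like Rollercoaster follows, written in modern Python rather than the original Applesoft BASIC type-in. The room names, frame numbers, and player-control functions below are all hypothetical stand-ins, not taken from the actual listing or from the real VP-1000 command set.

# Hypothetical sketch only: rooms map to still frames on the disc, and certain
# actions map to short stretches of footage. All names and numbers are invented.
ROOMS = {
    "airfield":    {"frame": 1200, "desc": "Your plane has just touched down."},
    "boardwalk":   {"frame": 4510, "desc": "Crowds stream toward the coaster."},
    "under_track": {"frame": 7800, "desc": "Something is taped to one of the struts."},
}

CLIPS = {
    ("under_track", "examine device"): (7810, 8100),  # start and end frame of a snippet
}

def seek_frame(frame):
    """Stand-in for commanding the player to display a single still frame."""
    print(f"[player] still frame {frame}")

def play_clip(start, end):
    """Stand-in for playing a stretch of footage, then pausing on its last frame."""
    print(f"[player] play frames {start}-{end}")

def enter_room(name):
    room = ROOMS[name]
    seek_frame(room["frame"])   # illustrate the location with one frame from the movie
    print(room["desc"])

def handle_command(room, command):
    clip = CLIPS.get((room, command))
    if clip:
        play_clip(*clip)        # reward the action with a snippet of footage
    else:
        print("Nothing happens.")

The text-adventure logic itself is entirely conventional; the laser disc simply takes over the role that static pictures or prose descriptions played in other adventure games of the period.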
Rollercoaster, written by David Lubar with major design contributions from Creative Computing’s publisher and editor David Ahl, is almost certainly the first computer game to incorporate what would come to be called the cut scene. It’s also the first to incorporate video footage from the real world, what would come to be given the tag of “full-motion video,” harbinger of a major if relatively short-lived game-industry craze of the 1990s. Still, its piggybacking on the film of another was legally problematic at best, and obviously inapplicable to a boxed-commercial-software industry. The fundamental blockage — that of lacking the resources to make original laser-disc content — remained. And then along came a Southern Californian named Rick Dyer, who made the deal that would bring interactive video to the masses.
Like so many others, Dyer had found his first encounter with Crowther and Woods’s Adventure a life-changing experience. Even as he built a lucrative career as a videogame programmer for Mattel and Coleco in the late 1970s and early 1980s, he dreamed of doing an adventure game in multimedia. Dissatisfied with the computer graphics available to him, he cast about far and wide for an alternative that would be more aesthetically pleasing. His first attempt was a sort of automated version of the Choose Your Own Adventure books that would soon be huge in children’s publishing. It consisted of a roll of tape upon which was printed text and pictures. As the player made choices using a keyboard, the controlling computer would shuttle the tape back and forth to expose the correct “page” for reading and viewing. Next he created a setup built around a computer-controlled slide projector, with a computer-controlled tape player used to play snippets of audio to accompany each slide. He also tried a complicated VCR setup, in which a videotape was laboriously rewound and fast-forwarded to find the next scenes needed by the game. When laser-disc players began to arrive in numbers, he felt he had the correct format at last. He started a company of his own, Advanced Micro Systems, and went to various toy companies with an idea he called The Fantasy Machine, a sort of interactive storybook for children. He found no takers. But then he met two partners just desperate enough to listen to his ideas.
Beginning in 1977, Cinematronics had developed and marketed quite a number of arcade games. Their games never entered the public consciousness in the way of an Asteroids or Pac-Man, but they did well enough, and are fondly remembered by arcade aficionados today. By 1982, however, the hits had stopped coming and the company’s vector-graphics technology had begun to look increasingly dated. Overextended and poorly managed like so many companies in this young and crazy industry, they ended up in bankruptcy, needing a hit game to convince the court not to liquidate them entirely. They found what looked like their best shot at such a thing in an unlikely place: in Dyer, who proposed adapting his interactive children’s storybook into an arcade experience. With little else on the horizon, they decided to roll the dice on Dyer’s scheme. Inside the arcade machine’s cabinet would be a very simple computer board, built around the tried-and-true Z80 processor, connected to a Pioneer laser-disc player. Dyer had found 5000 industrial-grade models of the latter languishing in a warehouse in Los Angeles, victims of the somewhat underwhelming reception of the laser disc in general. Pioneer was willing to sell them cheap — a critical consideration for a shoestring operation like this one.
With the hardware side in place, Dyer now needed someone to make the video footage his software would control. He had, in other words, come to the crux of the problem with computer-controlled video. Dyer, however, had an advantage: he lived on the doorstep of Hollywood. He was able to find just the person he needed in Don Bluth.
Bluth was a skilled animator who must have felt he had been born at the wrong time. He got his dream job at Disney in 1971, arriving just in time for the era that would go down as the nadir of Disney’s long history in animated film. Walt Disney’s Nine Old Men were now indeed getting old, and Walt himself was gone, leaving the studio without a strong leader. The result was almost two decades of critical and commercial underachievers, the sole exception being The Rescuers (1977), for which the eight remaining Old Men roused themselves to recapture the old spirit one last time. Bluth also got to work on that film, but otherwise his assignments were disheartening. Yet his options outside Disney were also limited at best. In this era the Saturday-morning cartoons were king. Bluth, a classicist by training and temperament, loathed the make’em-quick-and-cheap-and-sell-the-toys ethos of that world. (“There are two kinds of animation: the Bambi and Pinocchio classical style, and the Saturday-morning-cartoon type. I’d rather sell shoes than do the latter.”) When he left Disney at last in 1979 it was to form his own studio, Don Bluth Productions, to make the kind of big, lush animated features that didn’t seem to interest Disney anymore.
Most would say he delivered with 1982′s The Secret of NIMH, the first film that was fully his. But while the critics raved the public stayed away. Bluth blamed the film’s failure on his distributors MGM/UA, who failed to get it into enough theaters and promoted it only halfheartedly; MGM/UA would probably say that an old-fashioned, animated feature like NIMH was simply passé in the year of E.T., Star Trek II, and Tron. To add to Bluth’s woes, a major strike hit the animation industry just as he was hoping to begin production on a second film. He managed to cut a private deal with the union, but as he did so his financial backers lost faith and pulled their support for the new movie. With no obvious reason to continue to exist, Don Bluth Productions, like Cinematronics, faced bankruptcy and liquidation. And then, like Cinematronics, they got a call from Rick Dyer.
With little money of his own and with his partner literally bankrupt, Dyer couldn’t offer Bluth much beyond a one-third stake in whatever money the doubtful venture might eventually earn. Still, Bluth jumped on the proposal as “a dinghy to a sinking ship.” He scraped together a $300,000 loan, enough to prepare about five minutes of footage for a prototype system that the three partners, who now called themselves Starcom, could show to potential investors. The windfall came when Coleco, a Johnny-come-lately suddenly pushing hard to build a presence in home videogames, offered a cool $1 million for the right to make a home version of the game. Starcom wasn’t quite sure how Coleco was going to manage that, but they were thrilled to take their money. It was enough to let Bluth and company finish the 22 minutes of animated footage found in the final game, albeit barely; with no money to hire voice actors, for instance, the animators and their colleagues around the office simply did the voices themselves. (Editor Dan Molina, who voiced hapless hero Dirk the Daring, seems to have been channeling Scooby-Doo…)
The decision to make The Fantasy Machine into an arcade game necessitated a radical retooling of Dyer’s original vision. Such a slow-paced exercise in interactive storytelling was obviously not going to fit in the frenetic world of the arcade, where owners expected games that were over in a few minutes and ready to accept the next quarter. The Fantasy Machine therefore became Dragon’s Lair, with the story stripped down to the very basics. You guide Dirk the Daring, a courageous but awkward hero in the tradition of Wart from The Sword in the Stone. Dirk loves Daphne, a shapely but empty-headed feminist’s nightmare modeled from old Playboy centerfolds. (Bluth: “Daphne’s elevator didn’t go all the way to the top floor, but she served a purpose.”) Daphne has been kidnapped by the evil wizard Mordroc and his pet dragon — horrid pun coming! — Singe. The game comes down to escaping all of the monsters, traps, and other obstacles in Mordroc’s castle until you arrive in the inner sanctum at last for the final showdown.
Menus asking what to do next were replaced by action sequences which require you to make the right movement with the joystick or hit the fire button to strike with the hero’s sword at the right instant as the video plays; failure means the loss of one of your three lives. At first the team tried to preserve some semblance of you actually guiding the story by placing, say, several doors in a room, each leading to a different scene. In time, however, even that fell away, as all meaningful choices were replaced by what the development team called a “threat/resolve” model. The game as released plays its 30 or so scenes in a randomized order to keep you from getting bored — or too comfortable, thus extending your time at the machine. In each, complete success or complete failure at executing the necessary arbitrary movements in the proper time windows are the only options. You either survive, in which case that scene is checked off your to-do list, or you die, in which case you lose a life, one of the silly death animations which make up a huge chunk of the total content on the laser disc plays, and the scene is shuffled back into the deck.
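Purely as an illustration of that threat/resolve structure, here is a minimal sketch in Python of how such a loop might work; the scene data, timing windows, and helper functions are hypothetical, not drawn from Cinematronics’s actual Z80 code.

import random

# Hypothetical scene table: the disc chapter to play, the single correct input,
# and the deadline by which it must arrive. The real game chained many such
# prompts within each scene; one per scene keeps this sketch short.
SCENES = [
    {"chapter": 12, "correct_move": "RIGHT", "deadline": 4.5},
    {"chapter": 17, "correct_move": "SWORD", "deadline": 2.5},
    {"chapter": 23, "correct_move": "UP",    "deadline": 6.0},
]

def play_chapter(chapter):
    """Stand-in for commanding the laser-disc player to run a chapter of video."""
    print(f"[disc] playing chapter {chapter}")

def read_input(deadline):
    """Stand-in for polling the joystick and button until the deadline passes."""
    return input(f"move before t={deadline}s> ").strip().upper()

def threat_resolve(lives=3):
    deck = SCENES[:]
    random.shuffle(deck)                      # scenes arrive in randomized order
    while deck and lives > 0:
        scene = deck.pop()
        play_chapter(scene["chapter"])
        if read_input(scene["deadline"]) == scene["correct_move"]:
            print("Success: scene checked off the list.")
        else:
            print("Death animation plays; you lose a life.")
            lives -= 1
            deck.append(scene)                # failed scene goes back into the deck
            random.shuffle(deck)
    print("Victory!" if lives > 0 else "Game over.")

Note that the only state worth tracking is the number of lives remaining and which scenes are still in the deck; there is no world model, inventory, or real branching to speak of.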
Let’s take a look at one of these scenes in action. The clip below shows one of the longest scenes in the game, running almost a full minute; many others are over in a scant ten seconds or less. After you (unavoidably) fall into the conveniently placed boat, you have to make thirteen movements with the joystick at the right instants. If you flub any one of these, the scene is immediately interrupted for a separate death sequence.
Lengthy as it is, this is actually one of the easier scenes in the game. The flashes in the oncoming tunnels give some clear visual indication of what you need to do, and most of the necessary actions are fairly intuitive. You may only need to die three or four times here to get the sequence straight. Most scenes are not so forgiving. It’s never obvious just when you should be trying to control Dirk and when you should just be watching; nor is it always clear which move is the correct one, or just when it needs to be executed. You can learn only through trial and error. Back in the day, you were paying 50¢ for every three lives whilst doing so; as a technological showcase the game was priced at twice the normal going rate. If we define a good game as one that gives you lots of interesting choices, Dragon’s Lair must be the worst game ever. As John Cawley noted in his book about Don Bluth, it’s more of a maze than a game; the smartest players were those who just watched other people play for hours while noting the correct moves, to be used to hopefully run through the whole thing in one go when they finally felt ready. I like the description at Dave’s Arcade best: “[Dragon's Lair] is a hybrid of an animated movie and a Choose Your Own Adventure book…except the book is ripped from your hands and thrown across the room every time you fail to turn the page fast enough.” And yet, punishing as the game is while you learn the moves, it becomes trivial once you’ve accomplished that; within weeks of its release every arcade in America had that one annoying kid who had mastered it and used his skill to extend his time at the machine while frustrated arcade owners gnashed their teeth. Thus the game manages the neat trick of being too difficult and too easy at the same time.
None of which prevented it from turning into an absolute sensation when it arrived in arcades in the summer of 1983, and not only amongst the usual arcade rats. Thanks in some degree to Don Bluth, whose background as a traditional animator seemed to somehow legitimize Dragon’s Lair in their eyes, the mainstream media and the Hollywood establishment took to it with gusto. It got a feature spot on Entertainment Tonight, feature articles in The Hollywood Reporter and Daily Variety, front-page coverage in a hundred newspapers. Ricky on Silver Spoons got his own personal Dragon’s Lair machine to play on as part of the show’s fantasy of living the good life, teenage style. The New York Post called it “a quantum jump into a whole new art form of the arcade.” Many people who had resisted the lure of the arcades during the days of Space Invaders and Pac-Man now came in at last to have a look and give it a go. Cinematronics couldn’t make enough machines to meet demand. Those arcades that managed to secure one sometimes had to snake velvet ropes around the premises for the line of people waiting to play. Some wired up additional monitors and mounted them high above the machine so the people in line could watch the current player’s exploits. One allegedly installed bleachers for the pure spectators. Most machines earned back their $4000 purchase price within a week or so, while also boosting earnings from all of the other, older machines around them that people played when they got bored of waiting for Dragon’s Lair.
The craze isn’t difficult to understand. Cursory observation — about all the average non-gaming beat reporter was likely to give it — can make it seem that the player is really controlling Dirk, really guiding him through a lushly animated, interactive cartoon. Seen in this light, and when compared to the flickering sprites and electronic bleating of the other machines in the arcade, Dragon’s Lair could seem like an artifact beamed in from twenty years in the future. The audiovisual leap from old to new was so extreme as to be almost unfathomable, making Rick Dyer and Don Bluth look like technical sorcerers with access to secrets denied to the rest of the world. It felt like movies would have had they leaped from The Jazz Singer to Star Wars in a year.
Pundits within the industry, meanwhile, had their own strong motivations to see Dragon’s Lair and the “laser-disc revolution” it allegedly harbinged in the best possible light. What had begun as a worrisome lack of continued growth in the arcade and home-game-console industries during the second half of 1982 had by that summer of 1983 become a clear, undeniable downturn that was looking more and more like it was about to become a free fall. It appeared that all of those who had snorted dismissively about the videogame “fad” might just have been right. And so, just as Don Bluth saw Dragon’s Lair as the dinghy that could save his company and his career in animation, arcade owners and game makers saw Dragon’s Lair as the dinghy that could save their industry. And for a while that really did seem possible. While the bottom dropped out of the home videogame market, Dragon’s Lair kept the arcades above water. (One arcade owner made a comment about Dragon’s Lair‘s popularity that could be read as ominous as easily as ecstatic: “There is no number two. It’s just taken over.”) People in the industry convinced themselves that 1984 would bring a wave of other, even better laser-disc games and the high times would well and truly be here again. John Cook, writer for an industry magazine, gushed that “by this time next year a new videogame without a laser-disc player will be as rare as a silent movie in 1929.”
M.A.C.H. 3 and Bega’s Battle, two of the short-lived laser-disc games of 1984
In reality the second half of 1983, when Dragon’s Lair stood alone, was as good as it got for laser-disc games. Cook’s predicted avalanche of new games did hit with the new year, but they were uniformly uninspiring. They fell into two general categories: those that aped Dragon’s Lair‘s “interactive cartoon” approach with all of its associated limitations and those that used laser-disc video strictly as eye candy, displaying it behind and between levels of a more conventional game. In addition to their lack of depth, virtually all of these games also lacked the one saving grace of Dragon’s Lair: the skilled animators at Don Bluth Productions. Some of them did the best they could with the artists they could find; many grabbed their footage from cartoons or even feature films (Astron Belt, a game which actually predates Dragon’s Lair in its original Japanese release, used footage from the recent Star Trek II amongst other sources); all of them looked shabby in comparison to Bluth’s work. None did very well, and the industry as a whole settled back into the decline that Dragon’s Lair had briefly arrested.
Even Dyer and Bluth’s followup to Dragon’s Lair, Space Ace, despite having a more coherent, linear plot progression and giving the player at least a modicum more control over its direction, failed to recapture the old magic. Players of Dragon’s Lair had fallen into two groups: the casually curious, who lost a couple of dollars before they even figured out what was happening on the screen or what they were supposed to be doing and moved on with a shrug; and the committed, who doggedly worked out the moves and battled their way to the end. With the novelty of the cartoon graphics now gone, neither group showed much interest in repeating the experience. As for the arcade industry: it would eventually stabilize and even recover somewhat, but those heady days circa 1981 would never return.
Even at its peak Dragon’s Lair never quite paid off for the folks who made it the way the hype might have made you think it did. Their lack of financial resources and the bankruptcy courts who had to approve Cinematronics’s every move kept them from fully capitalizing on the early publicity. They eventually ran out of the surplus, discontinued laser-disc players that Dyer had found, and had a terrible time getting new ones out of Pioneer. Cinematronics did manage to produce over 10,000 units over Dragon’s Lair‘s brief production run, a very impressive figure in a slumping arcade industry, but could probably have sold several times that if they could only have made them while the craze lasted. On the other hand, the game’s scarcity doubtlessly added to its mystique, and allowed Cinematronics to sell each unit for $4000, easily twice the industry’s going rate. Less ambiguously damaging were the technical faults that started to crop up after a few months. Dragon’s Lair worked its laser-disc player hard, sending the laser careening all over the disc for ten or twelve hours per day of constant use. Meanwhile the machine that housed it was getting constantly kicked, slapped, and jostled by angry or jubilant players (more of the former, one suspects, given the nature of the game). Pioneer had never planned for such conditions. The players started to fail in relatively short order, leaving Cinematronics scrambling to replace them, at considerable expense in money and in the precious new laser-disc players they had to use as replacements, for angry arcade owners who had just lost their cash cow.
The partners, like the industry as a whole, mistook player infatuation for commitment. Space Ace, which cost twice as much as Dragon’s Lair to make, did a bare fraction of the business. Development of a third game, Dragon’s Lair II, was halted in March of 1984. It was hoped that this would just be a temporary delay, to let the laser-disc scene shake itself out a bit and the substandard Dragon’s Lair knock-offs fade away. But by July Cinematronics couldn’t sell the Dragon’s Lair and Space Ace games that were now clogging their warehouse. Production had finally ramped up just in time for demand to cease. The world had moved on; Dragon’s Lair II was cancelled. The people who had planned to make it had no choice but to move on as well, although not without accusations and threats amongst the partners as everyone blamed everyone else for what had happened.
Cinematronics straggled on in the diminished arcade industry for more years than anyone might have expected before finally being acquired by another arcade survivor: WMS Industries, the company that had once been Williams Electronics of Defender fame.
Rick Dyer renamed his company RDI Video Systems to continue to pursue his original dream of The Fantasy Machine. He put together a laser-disc entertainment system for the home called Halcyon, or just Hal for short, a deliberate play on the computer HAL from 2001: A Space Odyssey; apparently he judged that people thinking of inviting Hal into their living rooms wouldn’t think too much about HAL’s running rather messily amok in the film.
Hal talked to you, and, if he was in a good mood, accepted a limited number of voice commands back in return. This feature was enough evidence for RDI to declare that he was “artificially intelligent,” again without seeming to think about where Hal’s AI got the poor Discovery crew in the movie. Dyer and one of his partners appeared with Hal on Computer Chronicles, giving what has to be one of the most uncomfortable product demonstrations ever. Hal refuses to understand host Stewart Cheifet when he says the simple word “one,” to the point that Cheifet finally just gives up and takes option two instead. Meanwhile co-host Gary Kildall, no slouch in matters of computer science, presses Dyer and his associate relentlessly to abandon their patently silly AI claim; they just cling to it all the tighter.
Dyer hoped to release a whole line of interactive laser discs for Halcyon, but only three were ever completed: a couple of football games that use real NFL footage, and Thayer’s Quest, a menu-driven interactive story that hews very close to Dyer’s original plans for the game that became Dragon’s Lair.
Halcyon as a whole is an amazing, bizarre, visionary, kooky creation years ahead of its practical time. As the coup de grâce, RDI planned to sell it for a staggering $2200. It’s unclear whether any were actually sold on the open market before Dyer’s investors pulled the plug; if so, the numbers were truly minuscule. After Halcyon’s failure Dyer continued intermittently to work with interactive narratives, surfacing again in the mid-1990s with two adventure games, Kingdom: The Far Reaches and Kingdom II: Shadoan.
Don Bluth never had any real passion for videogames; it’s unfortunate that Dragon’s Lair has gone down in history as a Don Bluth creation, when in reality it was very much Rick Dyer’s vision. Even at the height of the game’s success Bluth always talked about it as a means to an end, a way to expose the arcade generation to the pleasures of classical animation rather than as a new type of entertainment in its own right. Short-lived as its success was, Dragon’s Lair served its purpose for Bluth. It did indeed become the dinghy that kept him afloat in the world of commercial animation until the opportunity to do another feature came along. Bluth found a backer in Steven Spielberg, whose Amblin Entertainment funded and released Bluth’s An American Tail in 1986. That film, along with the likes of The Brave Little Toaster and Who Framed Roger Rabbit, marked the beginning of a renaissance for animation on the big screen, paving the way for Pixar and a rejuvenated Disney to return the big-budget animated feature to the yearly blockbuster rolls in the 1990s. But another, more direct legacy of Dragon’s Lair probably didn’t thrill Bluth quite as much: the game was adapted into exactly the kind of knock-off Saturday-morning cartoon he loathed. Unfortunately for ABC, it debuted only in the fall of 1984, by which time the kids they were trying to reach had moved on long ago. It lasted for only one season of 13 episodes.
Coleco also saw little return for their investment in the Dragon’s Lair intellectual property. They had planned to introduce a laser-disc player for their ColecoVision console and/or their ill-fated Adam home computer, but soon realized — shades of the $2200 Halcyon — it would just be too expensive to be practical. Instead they funded a completely new game for the Adam inspired by scenes from the original. It didn’t look as nice, but was probably more fun in the long run. That game turned out to be just the first — and arguably one of the best — of a long, confusing stream of games that have carried the Dragon’s Lair name since. When Readysoft released a version for the Amiga in late 1988 it was rightly seen as a landmark. As the first version that looked reasonably close to the laser-disc original, it marked just how far computer graphics had come in five years; soon we would be in the era of Pixar, when computers would be used to create entire feature cartoons. But not all things change — the gameplay remained as simplistic as ever. Today Digital Leisure sells Dragon’s Lair, Space Ace, and Dragon’s Lair II, completed at last, in versions playable on anything from your Blu-Ray player to your iPhone. And yes, it’s still the same exercise in rote memorization it’s always been, with a few optional kindnesses to make the experience a bit less painful. Dragon’s Lair must be the most long-lived bad game in the history of the industry. Such is the power of nostalgia.
Dragon’s Lair makes an interesting study today not just as an historical curiosity or an example of style over substance, although it is both of those things. In addition to being one more crazy, unexpected offshoot of the original Adventure, that urtext of an industry, it’s an important early way station in gaming’s long relationship with movies; indeed, I believe it’s the first game to give itself the fraught title of “interactive movie.” The lesson that may seem obvious after playing Dragon’s Lair a few times is one that the industry would learn only slowly and painfully: non-interactive video is a problematic fit with an interactive medium, a subject we’ll undoubtedly explore in depth around here if we ever make it to the era of the lost and lamented (?) full-motion video games of the 1990s.
But for now let’s not judge Dragon’s Lair too harshly. It may not be much of a game, but, like so much of what I write about on this blog, it’s a great example of stretching available technology just as far as it will go and creating something kind of amazing in its time and place. For that golden six months in 1983, at least, that was more than enough. The impression it made on hearts and minds in that short span of time has fueled thirty years of nostalgia. Not bad for a 22-minute cartoon.
(As mentioned in the article, you can still buy various incarnations of Dragon’s Lair and associated games from Digital Leisure. You can also use some of these products as a key to let you play the games in their original form using the Daphne emulator. See that project’s website for more information.)
Unless you’re an extremely patient and/or nostalgic sort, most of the games I’ve been writing about on this blog for over two years now are a hard sell as something to just pick up and play for fun. There have been occasional exceptions: M.U.L.E., probably my favorite of any game I’ve written about so far, remains as magical and accessible as it was the day it was made; some or most of the Infocom titles remain fresh and entertaining as both fictions and games. Still, there’s an aspirational quality to even some of the most remarkable examples of gaming in this era. Colorful boxes and grandiose claims of epic tales of adventure often far exceeded the minimalist content of the disks themselves. In another era we might levy accusations of false advertising, but that doesn’t feel quite like what’s going on here. Rather, players and developers entered into a sort of partnership, a shared recognition that, well, there are sharp limits to what we can actually do with these simple computers we all have, but we can fill in all of the missing pieces with determined imaginings of what we could someday actually be getting on those disks.
Which didn’t mean that developers weren’t positively salivating after technological advances that could turn more of their aspirations into realities. Progress, of course, did come. Between the trinity of 1977 and 1983, the year we’re up to on this blog as I write this, typical memory sizes on the relatively inexpensive 8-bit machines found in homes increased from as little as 4 K to 48 K, with 64 K set to become the accepted minimum by 1984. The arrival of the Atari 400 and 800 in 1979 and the Commodore 64 in 1982 each brought major advances in audiovisual capabilities. Faster, more convenient disks replaced cassettes as the accepted standard storage medium, at least in North America. But other parts of the technological equation remained frozen, perhaps surprisingly so given the modern accepted wisdom about the pace of advancement in computing. Home machines in 1983 were still mostly based around one of the two CPUs found in the trinity of 1977, the Zilog Z80 or the MOS 6502, clocked at roughly the same speeds as in 1977. Thus, Moore’s Law notwithstanding, the processing potential that programmers had to work with remained for the moment frozen in place.
To find movement in this most fundamental part of a microcomputer we have to look to the more expensive machines. The IBM PC heralded the beginning of 16-bit microcomputing in 1981. The Apple Lisa of 1983 became the first mass-produced PC to use the state-of-the-art Motorola 68000, a chip which would have a major role to play in computing for the rest of the decade and beyond. Both the Lisa and an upgraded model of the IBM PC introduced in 1983, the PC/XT, also sported hard drives, which let them store several megabytes of data in constantly accessible form, and access it much more quickly and reliably than could be done with floppy disks. Still, these machines carried huge disadvantages to offset their technical advancements. The IBM PC and especially the PC/XT were, as noted, expensive, and had fairly atrocious graphics and sound even by the standards of 1983. The Lisa was really, really expensive, lacked color and sound, and was consciously designed to be as inaccessible to the hackers and bedroom coders who built the games industry as the Apple II was wide open. The advancements of the IBM PC and the Lisa would eventually be packaged into forms more useful to gamers and game developers, but for now for most gamers it was 8 bits, floppy disks, and (at best) 64 K.
Developers and engineers — and, I should note, by no means just those in the games industry and by no means just those working with the 8-bit machines — were always on the lookout for a secret weapon that might let them leapfrog some steps in what must have sometimes seemed a plodding pace of technological change, something that might let them get to that aspirational future faster. They found one that looked like it might just have potential in a surprising place: the world of ordinary consumer electronics. Or perhaps by 1983 it was not so surprising, for by then they had already been waiting for, speculating about, and occasionally tinkering with the technology in question for quite some years.
At the end of the 1960s, with the home-videocassette boom still years away, the American media conglomerate MCA and the Dutch electronics giant Philips each coincidentally began working separately on a technology to encode video onto album-like discs using optical storage. The video would be recorded as a series of tiny pits in the plastic surface of the disc, which could be read by the exotic technology of a laser beam scanning it as the disc was rotated. The two companies decided to combine their efforts after learning of one another’s existence a few years later, and by the mid-1970s they were holding regular joint demonstrations of the new technology, to which they gave the perfect name for the era: DiscoVision.
A DiscoVision prototype in action
Laser discs, as they came to be more commonly called, were however painfully slow to reach the market. A few pilots and prototype programs aside, the first real consumer-grade players did not reach stores in numbers until late 1980.
The Pioneer VP-1000, most popular of the early consumer-grade laser-disc players
By that time VCRs were selling in huge numbers. Laser discs offered far superior video and audio to VCRs, but, at least from the standpoint of most consumers, had enough disadvantages to more than outweigh that. They were much more expensive, for starters. And they could only hold about 30 minutes of video on a side; thus the viewer had to get up and flip or change the disc, album-style, three times over the course of a typical movie. This was a hard sell indeed to a couch-loving nation who were falling in love with their new remote controls as quickly as their VCRs. But it was likely the thing that the movie and television industry found most pleasing about the laser disc that really turned away consumers: the discs were read only, meaning it was impossible to record from the television, or to copy and swap movies and other programs with friends. Some (admittedly anecdotal) reports claim that up to half of the laser-disc players sold in the early years of the format were returned when their purchasers realized they couldn’t use them to record.
Thus the laser-disc format settled into a long half-life in which it never quite performed up to expectations but never flopped so badly as to disappear entirely. It became the domain of the serious cineastes and home-theater buffs who were willing to put up with its disadvantages in return for the best video and audio quality you could get in the home prior to the arrival of the DVD. Criterion appeared on the scene in 1984 to serve this market with a series of elaborate special editions of classic films loaded with the sorts of extras that other publishers wouldn’t begin to offer until the DVD: cast and crew interviews, “making of” documentaries, alternate cuts, unused footage, and of course the ubiquitous commentary track (like DVDs, laser discs had the ability to swap and mix audio streams). Even after DVDs began to replace VCRs en masse and change everything about home video circa 2000, a substratum of laser-disc loyalists soldiered on, some unwilling to give up on libraries they’d spent many years acquiring, others convinced, like so many vinyl-album boosters, that laser discs simply looked better than the “colder” digital images from DVDs or Blu-ray discs. (Although all of these mediums store data using the same basic optical techniques, in a laser disc the data is analog, and is processed using analog rather than digital circuitry.) Pioneer, who despite having nothing to do with the format’s development became its most consistent champion and was eventually responsible for more than half of all players sold, surprised those who already thought the format long dead in January of 2009 when they announced that they were discontinuing the last player still available for new purchase.
The technology developed for the laser disc first impacted the lives of those of us who didn’t subscribe to Sound and Vision in a different, more tangential way. Even as DiscoVision shambled slowly toward completion during the late 1970s a parallel product was initiated at Philips to adapt optical-storage technology into an audio disc. Once again Philips soon discovered another company working on the same thing, this time Sony of Japan, and the two elected to join forces. Debuting in early 1983, the new compact disc was first a hit mainly with the same sorts of technophiles and culture vultures who were also likely to purchase laser-disc players. Unlike the laser disc, however, its trajectory didn’t stop there. By 1988 400 million CDs were being pressed each year, at which time the format was just on the verge of its real explosion in popularity; nine years later that number was 5 billion, close to one CD for every person on the planet.
But now let’s back up and relate this new optical audiovisual technology to the computer technologies we’re more accustomed to spending our time with around these parts. Many engineers and programmers have an epiphany after working with electronics in general or computers in particular for a certain amount of time. Data, they realize, whether it represents an audio recording, video, text, or computer code, is ultimately just data. To a computer in particular it’s all just a stream of manipulatable numbers. The corollary to this fact is that a medium developed for the storage of one sort of data can be repurposed to store something else. Microcomputers in particular already had quite a tradition of doing just that even in 1983. The first common storage format for these machines was ordinary cassette tapes, playing on ordinary cassette players wired up to Altairs, TRS-80s, or Apple IIs. The data stored on these tapes, which when played back for human ears just sounded like a stream of discordant noise, could be interpreted by the computer as the stream of 0s and 1s which it encoded. It wasn’t the most efficient of storage methods, but it worked — and worked with a piece of cheap equipment found lying around in virtually every household, a critical advantage in those do-it-yourself days of hobbyist hackers.
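As a rough, hedged illustration of that principle (and emphatically not any particular machine's real cassette format), here is how a stream of bytes might be turned into two alternating audio tones, one frequency per bit value, in the general spirit of schemes like the Kansas City standard; the frequencies, baud rate, and framing below are simplified assumptions.

import math
import struct
import wave

RATE = 44100                 # samples per second in the output WAV file
BAUD = 300                   # bits per second
FREQ = {0: 1200, 1: 2400}    # one tone for a 0 bit, another for a 1 bit (assumed values)

def tone(freq, seconds):
    """Generate one burst of sine wave at the given frequency."""
    n = int(RATE * seconds)
    return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

def encode(data):
    """Turn a byte string into a list of audio samples, least-significant bit first."""
    samples = []
    for byte in data:
        for bit in range(8):
            samples += tone(FREQ[(byte >> bit) & 1], 1 / BAUD)
    return samples

def write_wav(path, samples):
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

# A one-line BASIC program becomes less than half a second of warbling audio.
write_wav("program.wav", encode(b'10 PRINT "HELLO"'))

Decoding is simply the reverse: measure the dominant frequency of each slice of incoming audio and emit a 0 or a 1 accordingly.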
If a cassette could be used to store a program, so could a laser disc. Doing so had one big disadvantage compared to other storage methods, the very same that kept so many consumers away from the format: unless you could afford the complicated, specialized equipment needed to write to them yourself, discs had to be stamped out at a special factory, complete with their contents, which afterward could only be read, not altered. But the upside… oh, what an upside! A single laser-disc side may have been good for only about 30 minutes of analog video, but it could store about 1 to 1.5 GB of digital computer code or data. The possibility of so much storage required an adjustment of the very scale of one’s thinking; articles even in hardcore magazines like Byte that published the figure had to include a footnote explaining what a gigabyte actually was.
Various companies initiated programs in the wake of the laser disc’s debut to adapt the technology to computers, resulting in a plethora of incompatible media and players. As Edward Rothchild wrote in Byte in March of 1983, “Discs are being made now in 12- and 14-inch diameters with 8-, 5 1/4-, 3-, and possibly 2-inch discs likely in the near future.”
The Toshiba DF-2000, a typically elaborate optical-storage-based institutional archiving system of the 1980s
Others moved beyond discs entirely to try cards, slides, even optical versions of old-fashioned reel-to-reel or cassette tapes. Some of the ideas that swirled around were compelling enough that you have to wonder why they never took off. A company called Drexler came out with the Drexon Laser Card, a card the size of a driver’s license or credit card with a strip on the back that was optical rather than magnetic and could store some 2 MB of data. They anticipated PCs of the future being equipped with a little slot for reading the cards. Amongst other possibilities, a complete operating system could be installed on a card, taking the place of ROM chips or an operating system loaded from disk. Updates would become almost trivial; the cards were cheap and easy to manufacture, and the end user would need only swap the old for the new to “install” the new operating system. Others anticipated Laser Cards becoming personal identification cards, with everything anyone could need to know about you, from citizenship status to credit rating, right there on the optical strip, a boon indeed in the much less interconnected world of the early 1980s. (From the department of things that never change: the privacy concerns raised by such a scheme were generally glossed over or ignored.)
The Drexon Laser Card
Some of these new technologies, the Laser Card alas not among them, did end up living on for quite some years. Optical storage is ideal for large, static databases like public records, especially in institutions that could afford the technology needed to create the discs as well as read them. IBM and others who served the institutional-computing market therefore came out with various products for this purpose, some of which persist to this day. In the world of PCs, however, progress was slow. It could be a bit hard to say what all that storage might actually be good for on a machine like, say, an Apple II. Today we fill our CDs and DVDs mostly with graphics and sound resources, but if you’ve seen the Apple II screenshots that litter this blog you know that home computers just didn’t have the video (or audio) hardware to make much direct use of such assets. Nor could they manipulate more than the most minuscule chunk of the laser disc’s cavernous capacity; connecting an Apple II to optical mass storage would be like trying to fill the family cat’s water bowl with a high-pressure fire hose. Optical media as a data-storage medium therefore came to PCs only slowly. When it did, it piggybacked not on the laser disc but on the newer, more successful format of the audio CD. The Yellow Book standard for the storage of data on CDs was published in 1985, accompanied by the first trickle of pioneering disc-housed encyclopedias and the like, and Microsoft hosted the first big conference demonstrating its potential in March of 1986. It took several more years to really catch on with the mass market, but by the early years of the 1990s CD-ROM was one of the key technologies at the heart of the “multimedia PC” boom. By this time processor speeds, memory sizes, and video and sound hardware had caught up and were able to make practical use of all that storage at last.
Still, even in the very early 1980s laser discs were not useless to even the most modest of 8-bit PCs. They could in fact be used to great effect in a way that hewed much closer to their original intended purpose. Considered as a video format, the most important property of the laser disc to understand beyond the upgrade in quality it represented over videotape is that it was a random-access medium. Videocassettes and all other, older mediums for film and video, by contrast, were linear formats. One could only unspool their contents sequentially; finding a given bit of content could only be accomplished via lots of tedious rewinding and fast forwarding. But with a laser disc one could jump to any scene, any frame, immediately; freeze a frame on the screen; advance forward or backward frame by frame or at any speed desired. The soundtrack could be similarly manipulated. This raised the possibility of a new generation of interactive video, which could be controlled by a computer as cheap and common as an Apple II or TRS-80. After all, all the computer had to do was issue commands to the player. All of the work of displaying the video on the screen, so far beyond the capabilities of any extant computer’s graphics hardware, was neatly sidestepped. For certain applications at least it really did feel like leapfrogging about ten years of slow technological progress. By manipulating laser-disc players, computers could deliver graphics and sound that they wouldn’t be able to produce natively until the next decade.
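What did issuing commands to the player look like in practice? Here is a purely hypothetical sketch of the sort of thin “driver” a host computer might wrap around such a player. The command strings, the LaserDiscPlayer name, and the frame numbers are all invented for illustration; real players each spoke their own protocol, whether over a serial port on the professional models or the repurposed remote-control input described below.

```python
# Hypothetical sketch of a host-side "driver" for a computer-controllable
# laser-disc player. The command vocabulary here is made up; it only
# illustrates the division of labor: the computer sends tiny commands,
# the player does all the heavy audiovisual work.

class LaserDiscPlayer:
    def __init__(self, port):
        self.port = port                  # any object with a write(bytes) method

    def send(self, command):
        self.port.write((command + "\r").encode("ascii"))

    def seek_frame(self, frame):
        self.send(f"SEEK {frame}")        # jump straight to any frame on the disc

    def play_until(self, frame):
        self.send(f"PLAY {frame}")        # play forward, stopping at the given frame

    def still(self):
        self.send("STILL")                # freeze the current frame on screen

# Showing one particular scene is then just a couple of commands:
#   player.seek_frame(18_000)
#   player.play_until(18_450)
```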
The people who worked on the DiscoVision project were not blind to the potential here. Well before the laser disc became widely available to consumers in 1980, they were already making available pre-release and industrial-grade models to various technology companies and research institutions. These were used for the occasional showcase, such as the exhibition at Chicago’s Museum of Science and Industry in 1979 which let visitors pull up an image of the front page from any edition of the Chicago Tribune ever published. They continued to offer throughout the 1980s admittedly pricey professional-level units that came equipped with a CPU and a modest amount of RAM. These could take instructions from their controlling computers and also talk back, telling what frame was currently playing, notifying the host when a particular snippet was finished, etc. The host could even load a simple program into the player’s memory and let it run unattended. Consumer-grade devices were more limited, but virtually all came equipped with a key feature: a remote-control sensor, which could be repurposed to let a computer control the player. Such control was more limited than was possible with the more expensive players — no talking back on the part of the player was possible. Still, it was intriguing stuff. Magazines like Byte and Creative Computing started publishing schematics and software to let the home user take control of her shiny new laser-disc player just months after the devices started becoming available to purchase in the first place. But, given all of the complications and the need to shoot video as well as write code and hack hardware to really create something, much of the most interesting work with interactive video was done by larger institutions. Consider, for example, the work done by the American Heart Association’s Advanced Technology Development division.
The AHA was eager to find a way to increase the quantity and quality of CPR training in the United States, and for very good reason: in 1980 it was estimated that an American stricken with a sudden heart attack had odds of 18 to 1 against there being someone on hand who could use CPR to sustain her life. Yet CPR training from a human instructor is logistically complicated and expensive. Many small-town governments and/or hospitals simply couldn’t manage to provide it. David Hon of the AHA believed that interactive video could provide the answer. The system his research group developed consisted of an Apple II interfaced to a laser-disc player as well as a mannequin equipped with a variety of sensors. An onscreen instructor taught the techniques of CPR step by step. After each step the system quizzed the student on what she had just learned; she could select her answers by touching the screen with a light pen. It then let her try it out on the mannequin until she had it down. The climax of the program came with a simulation of an actual cardiac emergency, complete with video and audio, explicitly designed to be exciting and dramatic. Hon:
We had learned something from games like Space Invaders: if you design a computer-based system in such a way that people know the difference between winning and losing, virtually anyone will jump in and try to win. Saving a life is a big victory and a big incentive. We were sure that if we could build a system that was easy to use and engaging, trainees would use it and learn from it willingly.
The trainee’s “coach” provides instruction and encouragement on the left monitor; the right shows the subject’s vital signs as the simulation runs
And indeed, at a cost of about $15,000 per portable system, the scheme turned out to be a big success, and probably saved some real lives.
One limitation of many early implementations of interactive video like this was the fact that the computer controller and the laser disc itself each had its own video display, with no way to mix the two on one screen, as you can clearly see in the photos above. In time, however, engineers developed the genlock, a piece of hardware which allowed a computer to overlay its own signal onto a video display. How might this be useful? Well, consider the very simple case of an educational game which quizzes children on geography. The computer could play some looping video associated with a given country sourced from the laser disc, while asking the player what country is being shown in text generated by the computer. Once the player answers, more text could be generated telling whether she got it right or not. But many saw such a program as representing the merest shadow of interactive video’s real potential. A group at the University of Nebraska developed a flight-training system which helped train prospective pilots by combining video and audio from actual flights with textual quizzes asking, “What’s happening here?” or “What should be done next?” or “What do these instruments seem to indicate?”
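To return to the geography quiz for a moment, a toy sketch makes the structure of such a program obvious. Everything here is hypothetical: the frame numbers are invented, and overlay_text() and read_answer() merely stand in for whatever the genlocked text display and light pen or keyboard actually provided.

```python
# Toy sketch of a genlock-era geography quiz. The disc supplies all the video;
# the computer only seeks, plays, and overlays a few lines of text.

# Invented start/end frame numbers for each country's footage on the disc.
COUNTRIES = {
    "France": (1_000, 1_900),
    "Japan":  (2_000, 2_900),
    "Brazil": (3_000, 3_900),
}

def run_quiz(player, overlay_text, read_answer):
    """player, overlay_text, and read_answer are stand-ins for real hardware."""
    score = 0
    for country, (start, end) in COUNTRIES.items():
        player.seek_frame(start)          # jump to the country's footage
        player.play_until(end)            # the disc does the audiovisual work
        overlay_text("What country is this?")
        if read_answer().strip().lower() == country.lower():
            overlay_text("Correct!")
            score += 1
        else:
            overlay_text("No, that was " + country + ".")
    return score
```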
Another University of Nebraska group developed a series of educational games meant to teach problem-solving to hearing-impaired children. They apparently played much like the full-motion-video adventure games of a decade later, combining video footage of real actors with puzzles and conversation menus to let the child find her own way through the story and solve the case.
The Minnesota Educational Computing Consortium (the same organization that distributed The Oregon Trail) developed a high-school economics course:
Three types of media are used in each session. A booklet introduces the lesson and directs the student to use the other pieces of equipment. At the same time, it provides space for note taking and record keeping. A microcomputer [an Apple II] contributes tutorial, drill, and practice dimensions to the lesson. And a videodisc player presents information, shows examples, and develops concepts which involve graphics or motion.
Apple themselves built commands for controlling laser-disc players into their SuperPILOT programming language, a rival to BASIC designed specifically for use in schools.
There was a widespread sense amongst these experimenters that they were pioneering a new paradigm of education and of computing, even if they themselves couldn’t quite put their fingers on what it was, what it meant, or what it should be called. In March of 1976, an amazingly early date when laser discs existed merely as crude prototypes, Alfred M. Bork envisioned what laser discs could someday mean to educational computing in an article that reads like a dispatch from the future:
I envision that each disc will contain a complete multimedia teaching package. Thus, a particular disc might be an elaborate teaching sequence for physics, having on the disc the computer code for that sequence (including possible microcode to make the stand-alone system emulate the particular machine that material was originally developed for), slides, audio messages, and video sequences of arbitrary length, all of these in many different segments. Thus, a teaching dialog stored on a videodisc would have full capability of handling very complex computer logic, and making sizable calculations, but it also could, at an appropriate point, show video sequences of arbitrary length or slides, or present audio messages. Another videodisc might have on it a complete language, such as APL, including a full multimedia course for learning APL interactively. Another might have relatively little logic, but very large numbers of slides in connection with an art-history or anatomy course. For the first time control of all the important audiovisual media would be with the student. The inflexibility of current film and video systems could be overcome too, because some videodiscs might have on them simply nothing but a series of film clips, with the logic for students to pick which ones they wanted to see at a particular time.
Bork uses a critical word in his first sentence above, possibly for the first time in relation to computing: “multimedia.” It’s a word that wouldn’t become commonplace until many years after Bork wrote this passage. Tony Feldman provided perhaps the most workable and elegant definition in 1994: “[Multimedia is] a method of designing and integrating computer technologies on a single platform that enables the end user to input, create, manipulate, and output text, graphics, audio, and video utilizing a single user interface.” This new paradigm of multimedia computing is key to almost all of the transformations that computers have made in people’s everyday lives in the thirty years that have passed since the pioneering experiments I just described. The ability to play back, catalog, combine, and transform various types of media, many or most of them sourced from the external world rather than being generated within the computer itself, is the bedrock of the World Wide Web, of your iPod and iPhone and iPad (or equivalent). Computers today can manipulate all of that media internally, with no need for the kludgy plumbing together of disparate devices that marks these early experiments, but the transformative nature of the concept itself remains. With these experiments in laser-disc-enabled interactive video we see the beginning of the replacement of the old analog world of solid-state electronics with our current digital world of smart, programmable media devices. That, much more than gigabytes of storage, is the real legacy of DiscoVision.
But of course these early experiments were just that, institutional initiatives seen by a relative few. There simply weren’t enough people with laser-disc players wired to their PCs for a real commercial market to develop; the process of getting a multimedia-computing setup working in the home was just too expensive and convoluted. It would be six or seven years more before “multimedia” became the buzzword of the zeitgeist — only to be quickly replaced in the public’s imagination by the World Wide Web, that further advance that multimedia enabled.
In the meantime, most people of the early 1980s had their first experience with this new paradigm of computing outside the home, in the form of — what else? — a game. We’ll talk about it next time.
(The most important sources for this article were: Byte magazines from June 1982, March 1983, and October 1984; Creative Computing from March 1976 and January 1982; Multimedia, a book by Tony Feldman; Interactive Video, a volume from 1989 in The Educational Technology Anthology Series; and various laser-disc-enthusiast sites on the Internet. I also lifted some of these ideas from my own book about the Amiga, The Future Was Here. The lovely picture that begins this article was on the cover of the June 1982 Byte. All of the other images were also taken from the various magazines listed above.)
There’s a lot of interesting stuff to talk about in Ultima III, to the extent that I wasn’t quite sure how to wedge it all into a conventional review. So I decided to try this approach, to balance my usual telling with quite a bit of showing. Or something like that. Anyway, I found it fun to do.
If you’re inspired to play Ultima III yourself, know that Good Old Games is selling it in a collection which also contains Ultima I and II. Less legitimately, there are the usual abandonware sites and ROM collections where you can find the original Apple II version that I play here, but you’re on your own there. Some spoilers do follow, although Ultima III is tricky enough that you may just welcome whatever little bit of guidance you glean from this post.
Garriott was really proud of his game’s subtitle, Exodus, to the extent that in the game itself and most early advertising it’s actually more prominent than the Ultima name. He draws no connection to its meaning as an English noun or to the Bible. It’s simply a cool-sounding word that he takes as the name of his latest evil wizard, the love child of his two previous evil wizards, Mondain from Ultima I and Minax from Ultima II. Roe R. Adams III did make a somewhat strained attempt to draw a connection to the expected implications of the word in the manual via a recasting of an old seafaring mystery:
One possible clue as to the identity of thy nemesis has been discovered. A derelict merchant ship was recently towed into port. No crewmen were aboard, alive or dead. Everyone had vanished, as if plucked by some evil force off the boat. The only thing found was a word written in blood on the deck: EXODUS.
I never hear anything about this ghost ship in the game itself. Also left unexplained, as it was in Ultima II, is why Mondain was on Garriott’s fantasy world of Sosaria and Minax was on our own Earth. This time I’m stuck back on Sosaria again. Garriott would finally get more serious about making an Ultima mythos that makes some kind of sense with the next game, but for now… let’s just say I won’t be spending much more time discussing the plotting or the worldbuilding.
In Ultima III I get to create and control a full party of four adventurers rather than a single avatar. This is actually the only Ultima that works quite this way. Later games would use the code Garriott first developed here to allow players to have more than one person in their parties, but would start them off with a single avatar. Finding other adventurers in the game world itself and convincing them to join would become part of the experience of play and an important component of those games’ much richer plots.
With my party created, I’m dumped into Sosaria, right outside the town of Britain and the castle of Lord British in what has already become by Ultima III a time-honored tradition.
One of the fascinating aspects of playing through the Ultima games in order is seeing which pieces are reused from earlier games and which are replaced. Programming often really is a game of interchangeable parts. On the left above is Ultima II, on the right Ultima III. The same old tile engine that dates back to Ultima I is still in place in both games, but Ultima III changes the screen layout considerably and makes everything a bit more attractive and ornate within the considerable limitations of the Apple II. It no longer uses the Apple II’s mixed display mode that displays text rather than graphics on the bottom four lines of the screen. Instead the whole screen is now given over to a graphics display, with a character generator, once an exotic piece of technology but by 1983 commonplace, used to put words anywhere on the screen.
When I enter a town for the first time another of Ultima III‘s additions to the old tile-graphics engine becomes clear: a line-of-sight algorithm now prevents me from seeing through walls. This adds an extra dimension of realism, but proves to be a mixed blessing. We’ll talk about why that is in just a little bit.
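For the technically curious, the principle behind such a line-of-sight test is simple enough to sketch in a few lines of Python. I have never seen Garriott's actual 6502 routine, so take this purely as an illustration of the idea: a tile is drawn only if no opaque tile sits on the straight line between it and the player.

```python
# Illustrative line-of-sight check for a tile map, not Ultima III's actual code.

OPAQUE = {"wall", "mountain", "closed_door"}   # hypothetical opaque tile types

def visible(grid, px, py, tx, ty):
    """True if the tile at (tx, ty) can be seen from the player at (px, py)."""
    dx, dy = tx - px, ty - py
    steps = max(abs(dx), abs(dy))
    for i in range(1, steps):                  # every tile strictly between the two
        x = px + round(dx * i / steps)
        y = py + round(dy * i / steps)
        if grid[y][x] in OPAQUE:
            return False                       # something blocks the view
    return True
```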
And when I run into a couple of wandering orcs for the first time I see another big addition: a separate strategic-combat screen that pops up when a fight begins. You can see that on the right above; the old Ultima II system of flailing in place on the map screen is on the left. The earlier system would obviously be unworkable with a party of four. Unlike with Wizardry, combat has never been the heart of Ultima‘s appeal, but that doesn’t mean you don’t spend a lot of time — maybe too much time — in Ultima III engaging in it. The new system does add some welcome interest to the old formula. I can now move each character about individually, use missile weapons (a highly recommended strategy that lets me take out many monsters before they can get close enough to damage me), and cast quite a variety of offensive and defensive spells. Less wonderfully, all those random encounters with orcs and cutthroats now take much more time to resolve, which is one of the things that can turn Ultima III into quite the slog by the time all is said and done. Also contributing to the tedium: in a harbinger of certain modern CRPGs, random encounters are balanced to suit the general potency of my party, thus guaranteeing that they will still take some time even once I have quite a powerful group of characters.
As part of a general tightening of the game’s mechanics likely prompted by unfavorable comparisons of previous Ultimas to previous Wizardries, the strange system of hit points as a commodity purchasable from Lord British has finally been overhauled. Now healing works as you might expect: each character has a maximum number of hit points which Lord British raises by 100 every time I visit him after gaining a level. Alas, this works only until level 25 and 2500 hit points. At least I don’t have to pay him for his trouble anymore. In the screenshot above his “Experience more!” means that I haven’t yet gained a level for him to boost my hit-point total; small wonder, as all my characters are still level 1.
Having gotten the initial lay of the land, I settle into the rhythm of building my characters, exploring the world map, and talking to everyone I can find in the towns. The latter process, like so much in Ultima III, is equal parts frustrating and gratifying. The good citizens of Sosaria insist on speaking in the most cryptic of riddles. And here we see the darker side of Garriott’s new line-of-sight system: most of the most vital clue-givers are tucked away in the most obscure possible corners of the towns, like the fellow shown in the screenshot above and left. I have to scour every town square by tedious square to be absolutely certain I haven’t missed a vital clue, a vital link in a chain of tasks required to win that is much more complicated than those found in the earlier games. On the other hand, the gratification that comes when another piece of the puzzle falls into place is considerable. Ultima has always been better at delivering that thrill of exploration than just about any other CRPG.
There are many small kindnesses scattered throughout Ultima III, elements that, once I figure out how they work, can make things easier. In the screenshot to the right I’m using a magic gem, purchasable from thieves’ guilds in a couple of the towns, to get a bird’s-eye view of the town I’m currently in. Ferreting out these secrets and hidden mechanics contributes to another thing Ultima always does well: making you feel smart.
Still, it’s possible to take this whole discovery thing too far. In one of the more astonishing design decisions in Ultima III, Garriott has consciously engineered into his hotkey-driven interface an element of guess the verb. After all, why should text adventurers have all the fun? There’s a mysterious OTHER command this time, which lets me enter new verbs. Divining what these are depends on my sussing that words surrounded by “<>” in characters’ speech refer to new verbs. (“<SEARCH> the shrines.”) A very strange design choice, which does a good job of illustrating the gulf in player expectations between now and then, when guess the verb was still trumpeted by many as an essential element of adventure games rather than just a byproduct of their technical limitations. Given that, why not try to engineer it into Ultima, a series which always tried to offer more, more, more? Thankfully, it would disappear again from Ultima IV, in what could be read as another reflection of changing player expectations.
In the screenshot at left above I’ve just used the hidden verb “BRIBE” to convince a guard who just a second before was standing right next to me to go away for the modest fee of 100 gold. Now I can go into the shop and steal with relative impunity. (Ultima III is, as we’ll continue to see, very much an amoral world, the last Ultima about which that can be said.) Bribing is only useful; other hidden verbs are vital.
For instance, the second screenshot above shows me gathering a piece of important information using the hidden verb “PRAY” inside a temple. This is actually quite an interesting sequence. PRAYing yields the information that I must YELL — YELL being one of the standard hotkey-based commands — “EVOCARE” at a certain place. It’s perilously close to two guess-the-verb — or at least guess-the-word — puzzles joined together.
We see an interesting re-purposing of previous Ultima technology in the form of the eight moon gates which wink in and out of existence in a set pattern on the world map. In Ultima II, you may recall, these supposedly allowed me to travel through time, although effectively they just provided access to different world maps; nothing I did in one time could have any direct effect on any of the others. Here they’re renamed and used more honestly, as ways to move quickly from place to place on the primary world map. (There are only two world maps this time, the primary one and an alternate world called Ambrosia which we’ll get to shortly.) They also allow me to reach a few places that are otherwise completely inaccessible, as the screenshot at right above illustrates. Well, okay… I could also get there with a ship, an element we’ll talk about later. But that’s not always the case; there’s at least one vital location that can be visited only via moon gate. Thus understanding the logic of the moon gates and charting their patterns is another critical aspect of cracking the puzzle of Ultima III. Moon gates would continue to be a fixture in the Ultimas to come.
Garriott had completely rewritten his dungeon-delving engine for Ultima II, replacing what had been the slowest and most painful part of Ultima I with a snappy new piece that swapped the wire-frame portrait of the surroundings for glorious filled-in color. It’s easily the most impressive and appreciated improvement in that game. But then, like so much else in Ultima II, he squandered it by giving his players no reason to go there. Thus Ultima III almost feels like the new dungeon engine’s real debut. Not only can I harvest a lot of desperately needed gold from the dungeons, but I must also explore them to find five vital “marks” that give special abilities which are in turn key to solving the game. And at the bottom of the Dungeon of Time I meet the Time Lord. (Garriott’s Time Bandits fixation had apparently not yet completely run its course — or are we now dealing with a Doctor Who obsession?) He gives a portentous clue that will be vital to the end-game.
Sosaria is still a world where might makes right. Lord British, the supposedly benevolent monarch, has a dirty little secret, an ugly torture chamber hidden in the depths of his castle. It’s almost enough to make you ask who’s really the evil one here. The manual talks a good game about Exodus, but he doesn’t actually do anything at all in the game itself, just hangs out in his castle and waits for us to come kill him. Meanwhile Lord British has torture chambers, and his lands are beset with monsters trying to kill me, and he seems completely uninterested in helping me beyond boosting my hit points from time to time. Nor am I exactly morally pure: my own mission in the torture chamber is not to save the fellow who’s been thrown into a lake of fire, merely to extract some information from him.
The screenshot at the right shows an even more morally questionable episode, albeit one that requires a bit more explanation. I’m the one on the horse. Each of the three clerics next to me has a critical clue to convey. However, I can’t interact on a diagonal, meaning that the one at bottom right is inaccessible to me — unless I open up a lane by killing one of his companions in cold blood, that is. I want to emphasize here that the clue the inaccessible cleric has to offer is absolutely necessary; he tells where to dig for some special weapons and armor that provide the only realistic way to survive the end-game in Exodus’s castle. Thus the only way forward is, literally, murder, and it’s a conscious design choice on Garriott’s part. Of course, he didn’t think of it quite that way. He just saw it as an interesting mechanic for a puzzle, having not yet made the leap himself from mechanics to experiential fiction. Again, all of that would change with Ultima IV.
Speaking of horses: given Garriott’s newfound willingness to edit, the vehicles available to me in Ultima III are neither so plentiful nor so outrageous as they were in Ultima II. The ridiculous and ridiculously cool airplane, for instance, is gone.
I can buy horses for my party in a couple of towns. These let me move overland a bit faster, using less food and avoiding many of the wandering monsters and the endless combats they bring which can test the patience of the hardiest of players. A ship can be acquired only by taking it from one of the roving bands of pirates that haunt the coastline. There aren’t actually a lot of pirates about, which can get very frustrating; a ship is required to visit several important areas of the game, and finding one can be tough. In the right-hand screenshot above I’ve sailed to an island, where, following the lead of the cleric whose companion I killed in cold blood, I’ve dug up the aforementioned special weapons that are required to harm Exodus’s innermost circle of minions.
I also need a ship to get to the alternate world of Ambrosia, which I can manage only by the counter-intuitive step of sailing into a whirlpool. Here I find shrines to each of the four abilities, the only ways to raise my scores above their starting values. Doing so is vital; in Ultima III‘s still somewhat strange system, ability scores have much more effect on my performance in combat and other situations than my character level. For instance, the number and power of spells I can cast has nothing to do with my level, only with my intelligence (wizard spells) or wisdom (cleric spells).
The explicitly Christian imagery in these shrines, and occasionally in other places in the game, is worth noting. It’s doubtless a somewhat thoughtless result of Garriott’s SCA activities and his accompanying fascination with real medieval culture, but it could certainly be read as disrespectful, a trivializing of religious belief. It’s the sort of thing that TSR, creators of Dungeons and Dragons, were always smart enough to stay well away from (not that it always helped them to avoid controversy). Similarly, you definitely will never see crosses in a big-budget modern fantasy CRPG.
Ready at last, I piece together a string of clues and sail to the “Silver Snake”. There I yell the password “EVOCARE” to enter Exodus’s private grotto. The Silver Snake itself provides a good illustration of just how intertwined the early Ultima games were with Garriott’s own life. And the anecdote that explains its presence here also shows some of the difficulties of trying to pin down the facts about Garriott’s life and career.
Growing up in Houston in the mid-1970s, Garriott was one of the few people to see the infamously awful adventure film Doc Savage: The Man of Bronze. Members of the lost Central American tribe that Savage battles in the movie all bear a tattoo on their chest of the Mayan god Kukulkan, about whom little is known today apart from his symbol: a serpent.
Young Richard thought the symbol so cool-looking that he went to his mother’s silversmithing workshop in that room above his family’s garage that would one day house Origin Systems and made the design — or as close an approximation as he could manage — for himself. He put his new amulet on a chain made from one of his mother’s belts. He told Shay Addams about it circa 1990:
“And this chain now resides around my neck 365 days a year, 24 hours a day — it has essentially remained there for the rest of my life ever since the day I put it on. There is no way to remove it without taking a screwdriver to it and prying open one of the links. For the first couple of years that I wore it, I actually had a link that I used to open and close a little bit. After I realized I was wearing out something by doing that, I quit doing it, so this necklace has remained here ever since. It literally never comes off. The chain was gold-colored when I first put it on. As it wears off, the colors keep changing, and now it rusts on my neck. I mean literally, every day. When I go, I may die of rust poisoning or something.”
Shortly after finishing Ultima III, Garriott loaned the original to his father Owen to carry with him on his second and final trip into space. It went into space again with Richard himself in 2008, and it seems that he still wears it frequently if not constantly. For what it’s worth, the color now seems to be a dull silver, almost a pewter shade.
But… wait. A close look at the early portrait of Origin Systems I published earlier shows that he doesn’t seem to be wearing it there, although Ken Arnold is using either the original or a duplicate as a key ring. Various other contemporary photos show no evidence of a chain or amulet, at least not of the construction and bulk of the one he wears to public appearances in recent years. Now, you could say that to even question this is petty, and in a very real sense you’d be right. Really what does it matter whether he never takes the serpent medallion off or whether it’s merely a precious link to his past that he wears on special occasions? I mention it here only because it points to how slippery everything involving Garriott can be, how much the man often seems to prefer SCA-style legend over the messier world of historical facts, and by extension how eager his interviewers and chroniclers often are to mythologize rather than document. That in turn forces me to spend far more time than I’d like to debunking or at least double-checking everything he says and much of what is said about him. But we’ve moved far afield from Ultima III now, so enough beating of this particular dead horse.
As I’ve mentioned before, Garriott excised most of the anachronistic science-fiction elements from Ultima III to focus on fantasy. But notice that I said “most.” When I get to the grand climax at last, I learn that Exodus apparently is in fact… a giant deranged computer in the tradition of Star Trek. The four magic cards I quested for were apparently punched cards — Exodus is an old-fashioned evil computer — that I need to use to shut him down or change his programming or… something. Of course, none of this makes a lick of sense — how did Mondain and Minax manage to breed a computer child? But I dutifully insert the cards and shut him down, and am left to “speculation” about Ultima IV.
In that spirit, let’s note that Garriott himself sees the Ultimas through Ultima III as essentially technical exercises, written “to satisfy my personal interest in seeing how much better a game I could put together with the skills I’d acquired while creating the previous game.” While his technology would continue to improve, with Ultima III it reached a certain point of fruition at which it was capable of delivering more than an exercise in rote mechanics, was capable of sustaining real experiential fictions. Garriott didn’t entirely realize that at the time he was writing Ultima III, and thus the game takes only the most modest of steps in that direction. When he started on the next one, however, it would all come home. In a way, it’s with that game that Ultima really became Ultima as we remember it today. We have much else to talk about before we get there, but I hope you’ll still be around when we do. With Ultima III Garriott had his foundation in place. Next would come the cathedral.