
The Later Years of Douglas Adams

If God exists, he must have a sense of humor, for why else would he have strewn so many practical jokes around his creation? Among them is the uncanny phenomenon of the talented writer who absolutely hates to write.

Mind you, I don’t mean just the usual challenges which afflict all of us architects of sentences and paragraphs. Even after all these years of writing these pieces for you, I’m still daunted every Monday morning to face a cursor blinking inscrutably at the top of a blank page, knowing as I do that that space has to be filled with a readable, well-constructed article by the time I knock off work the following Friday evening. In the end, though, that’s the sort of thing that any working writer knows how to get through, generally by simply starting to write something — anything, even if you’re pretty sure it’s the wrong thing. Then the sentences start to flow, and soon you’re trucking along nicely, almost as if the article has started to write itself. Whatever it gets wrong about itself can always be sorted out in revision and editing.

No, the kind of agony which proves that God must be a trickster is far more extreme than the kind I experience every week. It’s the sort of birth pangs suffered by Thomas Harris, the conjurer of everybody’s favorite serial killer Hannibal Lecter, every time he tries to write a new novel. Stephen King — an author who most definitely does not have any difficulty putting pen to paper — has described the process of writing as a “kind of torment” for his friend Harris, one which leaves him “writhing on the floor in frustration.” Small wonder that the man has produced just six relatively slim novels over a career spanning 50 years.

Another member of this strange club of prominent writers who hate to write is the Briton Douglas Adams, the mastermind of The Hitchhiker’s Guide to the Galaxy. Throughout his career, he was one of genre fiction’s most infuriating problem children, the bane of publishers, accountants, lawyers, and anyone else who ever had a stake in his actually sitting down and writing the things he had agreed to write. Given his druthers, he would prefer to sit in a warm bath, as he put it himself, enjoying the pleasant whooshing sound the deadlines made as they flew by just outside his window.

That said, Adams did manage to give outsiders at least the impression that he was a motivated, even driven writer over the first seven years or so of Hitchhiker’s, from 1978 to 1984. During that period, he scripted the twelve half-hour radio plays that were the foundation of the whole franchise, then turned them into four novels. He also assisted with a six-episode Hitchhiker’s television series, and even co-designed a hit Hitchhiker’s text adventure with Steve Meretzky of Infocom. Adams may have hated the actual act of writing, but he very much liked the fortune and fame it brought him; the former because it allowed him to expand his collection of computers, stereos, guitars, and other high-tech gadgetry, the latter because it allowed him to expand the profile and diversity of guests whom he invited to his legendary dinner parties.

Still, what with fortune and fame having become something of a done deal by 1984, his instinctive aversion to the exercising of his greatest talent was by then beginning to set in in earnest. His publisher got the fourth Hitchhiker’s novel out of him that summer only by moving into a hotel suite with him, standing over his shoulder every day, and all but physically forcing him to write it. Steve Meretzky had to employ a similar tactic to get him to buckle down and create a design document for the Hitchhiker’s game, which joined the fourth novel that year to become one of the final artifacts of the franchise’s golden age.

Adams was just 32 years old at this point, as wealthy as he was beloved within science-fiction fandom. The world seemed to be his oyster. Yet he had developed a love-hate relationship with the property that had gotten him here. Adams had been reared on classic British comedy, from Lewis Carroll to P.G. Wodehouse, The Goon Show to Monty Python. He felt pigeonholed as the purveyor of goofy two-headed aliens and all that nonsense about the number 42. In So Long, and Thanks for All the Fish, the aforementioned fourth Hitchhiker’s novel, he’d tried to get away from some of that by keeping the proceedings on Earth, delivering what amounted to a magical-realist romantic comedy in lieu of another zany romp through outer space. But his existing fans hadn’t been overly pleased by the change of direction; they made it clear that they’d prefer more of the goofy aliens and the stuff about 42 in the next book, if it was all the same to him. “I was getting so bloody bored with Hitchhiker’s,” Adams said later. “I just didn’t have anything more to say in that context.” Even as he was feeling this way, though, he was trying very hard to get Hollywood to bite on a full-fledged, big-budget Hitchhiker’s Guide to the Galaxy feature film. Thus we have the principal paradox of his creative life: Hitchhiker’s was both the thing he most wanted to escape and his most cherished creative comfort blanket. After all, whatever else he did or didn’t do, he knew that he would always have Hitchhiker’s.

For a while, though, Adams did make a concerted attempt to do some things that were genuinely new. He pushed Infocom into agreeing to make a game with him that was not the direct sequel to the computerized Hitchhiker’s that they would have preferred to make. Bureaucracy was rather to be a present-day social satire about, well, bureaucracy, inspired by some slight difficulties Adams had once had getting his bank to acknowledge a change-of-address form. Meanwhile he sold to his book publishers a pair of as-yet unwritten non-Hitchhiker’s novels, with advances that came to about $4 million combined. They were to revolve around Dirk Gently, a “holistic detective” who solved crimes by relying upon “the fundamental interconnectedness of all things” in lieu of more conventional clues. “They will be recognizably me but radically different, at least from my point of view,” he said. “The story is based on here and now, but the explanation turns out to be science fiction.”

Adams’s enthusiasm for both projects was no doubt authentic when he conceived them, but it dissipated quickly when the time came to follow through, setting a pattern that would persist for the rest of his life. He went completely AWOL on Infocom, leaving them stuck with a project they had never really wanted in the first place. It was finally agreed that Adams’s best mate, a fellow writer named Michael Bywater, would come in and ghost-write Bureaucracy on his behalf. And this Bywater did, making a pretty good job of it, all things considered. (As for the proper Hitchhiker’s sequel which a struggling Infocom did want to make very badly: that never happened at all, although Adams caused consternation and confusion for a while on both sides of the Atlantic by proposing that he and Infocom collaborate on it with a third party with which he had become enamored, the British text-adventure house Magnetic Scrolls. Perhaps fortunately under these too-many-cooks-in-the-kitchen circumstances, his follow-through here was no better than it had been on Bureaucracy, and the whole project died quietly after Infocom was shut down in 1989.)

Dirk Gently was a stickier wicket, thanks to the amount of money that Adams’s publishers had already paid for the books. They got them out of him at last using the same method that had done the trick for So Long, and Thanks for All the Fish: locking him in a room with a minder and not letting him leave until he had produced a novel. Dirk Gently’s Holistic Detective Agency was published in 1987, its sequel The Long Dark Tea-Time of the Soul the following year. The books had their moments, but fell a little flat for most readers. In order to be fully realized, their ambitious philosophical conceits demanded an attention to plotting and construction that was not really compatible with being hammered out under duress in a couple of weeks. They left Adams’s old fans nonplussed in much the same way that So Long… had done, whilst failing to break him out of the science-fiction ghetto in which he felt trapped. Having satisfied his contractual obligations in that area, he would never complete another Dirk Gently novel.

Then, the same year that the second Dirk Gently book was published, Adams stumbled into the most satisfying non-Hitchhiker’s project of his life. A few years earlier, during a jaunt to Madagascar, he had befriended a World Wildlife Fund zoologist named Mark Carwardine, who had ignited in him a passion for wildlife conservation. Now, the two hatched a scheme for a radio series and an accompanying book that would be about as different as possible from the ones that had made Adams’s name: the odd couple would travel to exotic destinations in search of rare and endangered animal species and make a chronicle of what they witnessed and underwent. Carwardine would be the expert and the straight man, Adams the voice of the interested layperson and the comic relief. They would call the project Last Chance to See, because the species they would be seeking out might literally not exist anymore in just a few years. To his credit, Adams insisted that Carwardine be given an equal financial and creative stake. “We spent many evenings talking into the night,” remembers the latter. “I’d turn up with a list of possible endangered species, then we’d pore over a world map and talk about where we’d both like to go.”

They settled on the Komodo dragon of Indonesia, the Rodrigues flying fox of Mauritius, the baiji river dolphin of China, the Juan Fernández fur seal of South America’s Pacific coast, the mountain gorilla and northern white rhinoceros of East Africa, the kākāpō of New Zealand, and the Amazonian manatee of Brazil. Between July of 1988 and April of 1989, they traveled to all of these places — often as just the two of them, without any additional support staff, relying on Adams’s arsenal of gadgets to record the sights and especially the sounds. Adams came home 30 pounds lighter and thoroughly energized, eager to turn their adventures into six half-hour programs that were aired on BBC Radio later that year.

Mark Carwardine and Douglas Adams in the Juan Fernández Islands.

The book proved predictably more problematic. It was not completed on schedule, and was in a very real sense not even completed at all when it was wrenched away from its authors and published in 1990; the allegedly “finished” volume covers only five of the seven expeditions, and one of those in a notably more cursory manner than the others. Nevertheless, Adams found the project as a whole a far more enjoyable experience than the creation of his most recent novels had been. He had a partner to bounce ideas off of, making the business that much less lonely. And he wasn’t forced to invent any complicated plots from whole cloth, something for which he had arguably never been very well suited. He could just inhale his surroundings and exhale them again for the benefit of his readers, with a generous helping of the droll wit and the altogether unique perspective he brought to things. His descriptions of nature and animal life were often poignant and always delightful, as were those of the human societies he and Carwardine encountered. “Because I had an external and important subject to deal with,” mused Adams, “I didn’t feel any kind of compulsion to be funny the whole time — and oddly enough, a lot of people have said it’s the funniest book I’ve written.”

An example, on the subject of traffic in the fast-rising nation of China, which the pair visited just six months before the massacre in Tiananmen Square showed that its rise would take place on terms strictly dictated by the Communist Party:

Foreigners are not allowed to drive in China, and you can see why. The Chinese drive, or cycle, according to laws that are simply not apparent to an uninitiated observer, and I’m thinking not merely of the laws of the Highway Code; I’m thinking of the laws of physics. By the end of our stay in China, I had learnt to accept that if you are driving along a two-lane road behind another car or truck, and there are two vehicles speeding towards you, one of which is overtaking the other, the immediate response of your driver will be to also pull out and overtake. Somehow, magically, it all works out in the end.

What I could never get used to, however, was this situation: the vehicle in front of you is overtaking the vehicle in front of him, and your driver pulls out and overtakes the overtaking vehicle, just as three other vehicles are coming towards you performing exactly the same manoeuvre. Presumably Sir Isaac Newton has long ago been discredited as a bourgeois capitalist running-dog lackey.

Adams insisted to the end of his days that Last Chance to See was the best thing he had ever written, and I’m not at all sure that I disagree with him. On the contrary, I find myself wishing that he had continued down the trail it blazed, leaving the two-headed aliens behind in favor of becoming some combination of humorist, cultural critic, and popular-science writer. “I’m full of admiration for people who make science available to the intelligent layperson,” he said. “Understanding what you didn’t before is, to me, one of the greatest thrills.” Douglas Adams could easily have become one of those people whom he so admired. It seems to me that he could have excelled in that role, and might have been a happier, more satisfied man in it to boot. But it didn’t happen, for one simple reason: as well as taking a spot in the running for the title of best book he had ever written, Last Chance to See became the single worst-selling one. Adams:

Last Chance to See was a book I really wanted to promote as much as I could because the Earth’s endangered species is a huge topic to talk about. The thing I don’t like about doing promotion usually is that you have to sit there and whinge on about yourself. But here was a big issue I really wanted to talk about, and I was expecting to do the normal round of press, TV, and radio. But nobody was interested. They just said, “It isn’t what he normally does, so we’ll pass on this, thank you very much.” As a result, the book didn’t do very well. I had spent two years and £150,000 of my own money doing it. I thought it was the most important thing I’d ever done, and I couldn’t get anyone to pay any attention.

Now, we might say at this point that there was really nothing keeping Adams from doing more projects like Last Chance to See. Financially, he was already set for life, and it wasn’t as if his publishers were on the verge of dropping him. He could have accepted that addressing matters of existential importance isn’t always the best way to generate high sales, and could have kept at it anyway. In time, perhaps he could have built a whole new audience and authorial niche for himself.

Yet all of that, while true enough on the face of it, fails to address just how difficult it is for anyone who has reached the top of the entertainment mountain to accept relegation to a base camp halfway down its slope. It’s the same phenomenon that today causes Adams’s musical hero and former dinner-party guest Paul McCartney, who is now more than 80 years old, to keep trying to score one more number-one hit instead of just making the music that pleases him. Once you’ve tasted mass adulation, modest success can have the same bitter tang as abject failure. There are artists who are so comfortable in their own skin, or in their own art, or in their own something, that this truism does not apply. But Douglas Adams, a deeply social creature who seemed to need the approbation of fans and peers as much as he needed food and drink, was not one of them.

So, he retreated to his own comfort zone and wrote another Hitchhiker’s novel. At first it was to be called Starship Titanic, but then it became Mostly Harmless. The choice to name it after one of the oldest running gags in the Hitchhiker’s series was in some ways indicative; this was to be very much a case of trotting out the old hits for the old fans. The actual writing turned into the usual protracted war between Adams’s publisher and the author himself, who counted as his allies in the cause of procrastination the many shiny objects that were available to distract a wealthy, intellectually curious social butterfly such as him. This time he had to be locked into a room with not only a handler from his publisher but his good friend Michael Bywater, who had, since doing Bureaucracy for Infocom, fallen into the role of Adams’s go-to ghostwriter for many of the contracts he signed and failed to follow through on. Confronted with the circumstances of its creation, one is immediately tempted to suspect that substantial chunks of Mostly Harmless were actually Bywater’s work. By way of further circumstantial evidence, we might note that some of the human warmth that marked the first four Hitchhiker’s novels is gone, replaced by a meaner, archer style of humor that smacks more of Bywater than the Adams of earlier years.

It’s a strange novel — not a very good one, but kind of a fascinating one nonetheless. Carl Jung would have had a field day with it as a reflection of its author’s tortured relationship to the trans-media franchise he had spawned. There’s a petulant, begrudging air to the thing, right up until it ends in the mother of all apocalypses, as if Adams was trying to wreck his most famous creation so thoroughly that he would never, ever be able to heed its siren call again. “The only way we could persuade Douglas to finish Mostly Harmless,” says Michael Bywater, “was [to] offer him several convincing scenarios by which he could blow up not only this Earth but all the Earths that may possibly exist in parallel universes.” That was to be that, said Adams. No more Hitchhiker’s, ever; he had written the franchise into a black hole from which it could never emerge. Which wasn’t really true at all, of course. He would always be able to find some way to bring the multidimensional Earth back in the future, should he decide to, just as he had once brought the uni-dimensional Earth back from its destruction in the very first novel. Such is the advantage of being god of your own private multiverse. Indeed, there are signs that Adams was already having second thoughts before he even allowed Mostly Harmless to be sent to the printer. At the last minute, he sprinkled a few hints into the text that the series’s hero Arthur Dent may in fact have survived the apocalypse. It never hurts to hedge your bets.

Published in October of 1992, Mostly Harmless sold better than Last Chance to See or the Dirk Gently novels, but not as well as the golden-age Hitchhiker’s books. Even the series’s most zealous fans could smell the ennui that fairly wafted up from its pages. Nevertheless, they would have been shocked if you had told them that Douglas Adams, still only 40 years old, would never finish another book.

The next several years were the least professionally productive of Adams’s adult life to date. This wasn’t necessarily a bad thing; there is, after all, more to life than one’s career. He had finally married his longtime off-and-on romantic partner Jane Belson in 1991, and in 1994, when the husband’s age was a thoroughly appropriate 42, the couple had their first and only child. When not doting on his baby daughter Polly, Adams amused himself with his parties and his hobbies, which mostly involved his beloved Apple Macintosh computers and, especially, music. He amassed what he believed to be the largest collection of left-handed guitars in the world. His friend David Gilmour gave him his best birthday gift ever when he allowed him to come onstage and play one of those guitars with Pink Floyd for one song on their final tour. Adams also performed as one half of an acoustic duo at an American Booksellers Association conference; the duo’s other half was the author Ken Follett. He even considered trying to make an album of his own: “It will basically be something very similar to Sgt. Pepper, I should think.” Let it never be said that Douglas Adams didn’t aim high in his flights of fancy…

Adams gives his daughter Polly some early musical instruction.

With Adams thus absent from the literary scene, his position as genre fiction’s premier humorist was seized by Terry Pratchett, whose first Discworld novels of the mid-1980s might not unfairly be described as an attempt to ape Adams in a fantasy rather than a science-fiction setting, but who had long since come into his own. Pratchett evinced none of Adams’s fear and loathing of the actual act of writing, averaging one new Discworld novel every nine months throughout the 1990s. By way of a reward for his productivity, his wit, and his boundless willingness to take his signature series in unexpected new directions, he became the most commercially successful single British author of any sort of the entire decade.

A new generation of younger readers adored Discworld but had little if any familiarity with Hitchhiker’s. While Pratchett basked in entire conventions devoted solely to himself and his books, Adams sometimes failed to muster an audience of more than twenty when he did make a public appearance — a sad contrast to his book signings of the early 1980s, when his fans had lined up by the thousands for a quick signature and a handshake. A serialized graphic-novel adaptation of Hitchhiker’s, published by DC Comics, was greeted with a collective shrug, averaging about 20,000 copies sold per issue, far below projections. Despite all this clear evidence, Adams, isolated in his bubble of rock stars and lavish parties, seemed to believe he still had the same profile he’d had back in 1983. That belief — or delusion — became the original sin of his next major creative project, which would sadly turn out to be the very last one of his life.

The genesis of Douglas Adams’s second or third computer game — depending on what you make of Bureaucracy — dates to late 1995, when he became infatuated with a nascent collective of filmmakers and technologists who called themselves The Digital Village. The artists’ colony-cum-corporation was the brainchild of Robbie Stamp, a former producer for Britain’s Central Television: “I was one of a then-young group of executives looking at the effects of digital technology on traditional media businesses. I felt there were some exciting possibilities opening up, in terms of people who could understand what it would mean to develop an idea or a brand across a variety of different platforms and channels.” Stamp insists that he wasn’t actively fishing for money when he described his ideas one day to Adams, who happened to be a friend of a friend of his named Richard Creasey. He was therefore flabbergasted when Adams turned to him and asked, “What would it take to buy a stake?” But he was quick on his feet; he named a figure without missing a beat. “I’m in,” said Adams. And that was that. Creasey, who had been Stamp’s boss at Central Television, agreed to come aboard as well, and the trio of co-founders was in place.

One senses that Adams was desperate to find a creative outlet that was less dilettantish than his musical endeavors but also less torturous than being locked into a room and ordered to write a book.

When I started out, I worked on radio, I worked on TV, I worked onstage. I enjoyed and experimented with different media, working with people and, wherever possible, fiddling with bits of equipment. Then I accidentally wrote a bestselling novel, and the consequence was that I had to write another and then another. After a decade or so of this, I became a little crazed at the thought of spending my entire working life in a room by myself typing. Hence The Digital Village.

The logic was sound enough when considered in the light of the kind of personality Adams was; certainly one of the reasons Last Chance to See had gone so well had been the presence of an equal partner to keep him engaged.

Still, the fact remained that it could be a little hard to figure out what The Digital Village was really supposed to be. Rejecting one of the hottest buzzwords of the age, Adams insisted that it was to be a “multiple media” company, not a “multimedia” one: “We’re producing CD-ROMs and other digital and online projects, but we’re also committed to working in traditional forms of media.” To any seasoned business analyst, that refusal to focus must have sounded like a recipe for trouble; “do one thing very, very well” is generally a surer formula for success in business than the jack-of-all-trades approach. And as it transpired, The Digital Village would not prove an exception to this rule.

Their first idea was to produce a series of science documentaries called Life, the Universe, and Evolution, a riff on the title of the third Hitchhiker’s novel; that scheme fell through when they couldn’t find a television channel that was all that interested in airing it. Their next idea was to set up The Hitchhiker’s Guide to the Internet, a search engine to compete with Yahoo!, the then-reigning king of Web searching; that scheme fell through when they realized that they had neither the financial resources nor the technical expertise to pull it off. And so on and so on. “We were going to be involved in documentaries, feature films, and the Internet,” says Richard Creasey regretfully. “And bit by bit they all went away. Bit by bit, we went down one avenue which was, in the nicest possible way, a disaster.”

That avenue was a multimedia adventure game, a project which would come to consume The Digital Village in more ways than one. It was embarked upon for the very simple reason that it was the only one of the founders’ ideas for which they could find adequate investment capital. At the time, the culture was living through an odd echo of the “bookware” scene of the mid-1980s, of which Infocom’s Hitchhiker’s game has gone down in history as the most iconic example. A lot of big players in traditional media were once again jumping onto the computing bandwagon with more money than sense. Instead of text and text parsers, however, Bookware 2.0 was fueled by great piles of pictures and video, sound and music, with a thin skein of interactivity to join it all together. Circa 1984, the print-publishing giant Simon & Schuster had tried very, very hard to buy Infocom, a purchase that would have given them the Hitchhiker’s game that was then in the offing. Now, twelve years later, they finally got their consolation prize, when Douglas Adams agreed to make a game just for them. All they had to do was give him a few million dollars, an order of magnitude more than Infocom had had to put into their Hitchhiker’s.

The game was to be called Starship Titanic. Like perhaps too many Adams brainstorms of these latter days, it was a product of recycling. As we’ve already seen, the name had once been earmarked for the novel that became Mostly Harmless, but even then it hadn’t been new. No, it dated all the way back to the 1982 Hitchhiker’s novel Life, the Universe, and Everything, which had told in one of its countless digressions of a “majestic and luxurious cruise liner” equipped with a flawed prototype of an Infinite Improbability Drive, such that on its maiden voyage it had undergone “a sudden and gratuitous total existence failure.” In the game, the vessel would crash through the roof of the player’s ordinary earthly home; what could be more improbable than that? Then the player would be sucked aboard and tasked with repairing the ship’s many wildly, bizarrely malfunctioning systems and getting it warping through hyperspace on the straight and narrow once again. Whether Starship Titanic exists in the same universe — or rather multiverse — as Hitchhiker’s is something of an open question. Adams was never overly concerned with such fussy details of canon; his most devoted fans, who very much are, have dutifully inserted it into their Hitchhiker’s wikis and source books on the basis of that brief mention in Life, the Universe, and Everything.

Adams was often taken by a fit of almost manic enthusiasm when he first conceived of a new project, and this was definitely true of Starship Titanic. He envisioned another trans-media property to outdo even Hitchhiker’s in its prime. Naturally, there would need to be a Starship Titanic novel to accompany the game. Going much further, Adams pictured his new franchise fulfilling at last his fondest unrequited dream for Hitchhiker’s. “I’m not in a position to make any sort of formal announcement,” he told the press cagily, “but I very much hope that it will have a future as a movie as well.” There is no indication that any of the top-secret Hollywood negotiations he was not-so-subtly hinting at here ever took place.

In their stead, just about everything that could possibly go wrong with the whole enterprise did so. It became a veritable factory for resentments and bad feelings. Robbie Stamp and Richard Creasey, who didn’t play games at all and weren’t much interested in them, were understandably unhappy at seeing their upstart new-media collective become The Douglas Adams Computer Games Company. This created massive dysfunction in the management ranks.

Predictably enough, Adams brought in Michael Bywater to help him when his progress on the game’s script stalled out. Indeed, just as is the case with Mostly Harmless, it’s difficult to say where Douglas Adams stops and Michael Bywater begins in the finished product. In partial return for his services, Bywater believed, his friend had implicitly or explicitly promised that he could write the Starship Titanic novel and, for once, put his own name on it. But this didn’t happen in the end. Instead Adams farmed it out to Robert Sheckley, his favorite old-school science-fiction writer, who was in dire financial straits and could use the work. When Sheckley repaid his charity with a manuscript that was so bad as to be unpublishable, Adams bypassed Bywater yet again, giving the contract to another friend, the Monty Python alum Terry Jones, who also did some voice acting in the game. Bywater was incensed by this demonstration of exactly where he ranked in Adams’s entourage; it seemed he was good enough to become the great author’s emergency ghostwriter whenever his endemic laziness got him into a jam, but not worthy of receiving credit as a full-fledged collaborator. The two parted acrimoniously; the friendship, one of the longest and closest in each man’s life, would never be fully mended.

And all over a novel which, under Jones’s stewardship, came out tortuously, exhaustingly unfunny, the very essence of trying way too hard.

“Where is Leovinus?” demanded the Gat of Blerontis, Chief Quantity Surveyor of the entire North Eastern Gas District of the planet of Blerontin. “No! I do not want another bloody fish-paste sandwich!”

He did not exactly use the word “bloody” because it did not exist in the Blerontin language. The word he used could be more literally translated as “similar in size to the left earlobe,” but the meaning was much closer to “bloody.” Nor did he actually use the phrase “fish paste,” since fish do not exist on Blerontin in the form in which we would understand them to be fish. But when one is translating from a language used by a civilisation of which we know nothing, located as far away as the centre of the galaxy, one has to approximate. Similarly, the Gat of Blerontis was not exactly a “Quantity Surveyor,” and certainly the term “North Eastern Gas District” gives no idea at all about the magnificence and grandeur of his position. Look, perhaps I’d better start again…

Oh, my. Yes, Terry, perhaps you should. Whatever else you can say about Michael Bywater, he at least knew how to ape Douglas Adams without drenching the page in flop sweat.

The novel came out in December of 1997, a few months before the game, sporting on its cover the baffling descriptor Douglas Adams’s Starship Titanic by Terry Jones. In a clear sign that Bookware 2.0 was already fading into history alongside its equally short-lived predecessor, Simon & Schuster gave it virtually no promotion. Those critics who deigned to notice it at all savaged it for being exactly what it was, a slavishly belabored third-party imitation of a set of tired tropes. Adams and Jones did a short, dispiriting British book tour together, during which they were greeted with half-empty halls and bookstores; those fans who did show up were more interested in talking about the good old days of Hitchhiker’s and Monty Python than Starship Titanic. It was not a positive omen for the game.

At first glance, said game appears to be a typical product of the multimedia-computing boom, when lots and lots of people with a lot of half-baked highfalutin ideas about the necessary future of games suddenly rushed to start making them, without ever talking to any of the people who had already been making them for years or bothering to try to find out what the ingredients of a good, playable game might in fact be. Once you spend just a little bit of time with Starship Titanic, however, you begin to realize that this rush to stereotype it has done it a disservice. It is in reality uniquely awful.

From Myst and its many clones, it takes its first-person perspective and its system of navigation, in which you jump between static, pre-rendered nodes in a larger contiguous space. That approach is always a little unsatisfactory even at its best — what you really want to be doing is wandering through a seamless world, not hopping between nodes — but Starship Titanic manages to turn the usual Mysty frustrations into a Gordian Knot of agony. The amount of rotation you get when you click on the side of the screen to turn the view is wildly inconsistent from node to node and turn to turn, even as the views themselves seem deliberately chosen to be as confusing as possible. This is the sort of game where you can find yourself stuck for hours because you failed to spot… no, not some tiny little smear of pixels on the floor representing some obscure object, but an entire door that can only be seen from one fiddly angle. Navigating the spaceship is the Mount Everest of fake difficulties — i.e., difficulties that anyone who was actually in this environment would not be having.

Myst clones usually balance their intrinsic navigational challenges with puzzles that are quite rigorously logical, being most typically of the mechanical stripe: experiment with the machinery to deduce what each button and lever does, then apply the knowledge you gain to accomplish some task. But not Starship Titanic. It relies on the sort of moon logic that’s more typical of the other major strand of 1990s adventure game, those that play out from a third-person perspective and foreground plot, character interaction, and the player’s inventory of objects to a much greater degree. Beyond a certain point, only the “try everything on everything” method will get you anywhere in Starship Titanic. This is made even more laborious by an over-baked interface in which every action takes way more clicks than it ought to. Like everything else about the game, the interface too is wildly inconsistent; sometimes you can interact with things in one way, sometimes in another, with no rhyme or reason separating the two. You just have to try everything every which way, and maybe at some point something works.

Having come this far, but still not satisfied with merely having combined the very worst aspects of the two major branches of contemporary adventure games, Douglas Adams looked to the past for more depths to plumb. At his insistence, Starship Titanic includes, of all things, a text parser — a text parser just as balky and obtuse as most of the ones from companies not named Infocom back in the early 1980s. It rears its ugly head when you attempt to converse with the robots who are the ship’s only other inhabitants. The idea is that you can type what you want to say to them in natural language, thereby having real conversations with them. Alas, the end result is more Eliza than ChatGPT. The Digital Village claimed to have recorded sixteen hours of voiced responses to your conversational sallies and inquiries. This sounds impressive — until you start to think about what it means to try to pack coherent responses to literally anything in the world the player might possibly say to a dozen or so possible interlocutors into that span of time. What you get out on the other end is lots and lots of variations on “I don’t understand that,” when you’re not being blatantly misunderstood by a parser that relies on dodgy pattern matching rather than any thoroughgoing analysis of sentence structure. Nothing illustrates more cogently how misconceived and amateurish this whole project was; these people were wasting time on this nonsense when the core game was still unplayable. Adams, who had been widely praised for stretching the parser in unusual, slightly postmodern directions in Infocom’s Hitchhiker’s game, clearly wanted to recapture that moment here. But he had no Steve Meretzky with him this time — no one at all who truly understood game design — to corral his flights of imagination and channel them into something achievable and fun. It’s a little sad to see him so mired in an unrecoverable past.
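For readers who have never poked at the innards of one of these things, the kind of dialogue engine being described here is easy to sketch. What follows is a deliberately toy illustration in Python, with every pattern and response invented for the occasion; The Digital Village’s actual engine is not public and was surely more elaborate, but the basic technique of matching surface keywords against canned replies is the same one that dates all the way back to Eliza in 1966.

import random
import re

# A toy Eliza-style responder. Each rule pairs a regular expression with some
# canned replies; the first rule that matches wins. All patterns and responses
# are invented for this illustration.
RULES = [
    (re.compile(r"\b(hello|hi|greetings)\b", re.I),
     ["Greetings, passenger.", "Ah. Another guest. How tiresome."]),
    (re.compile(r"\bfix\b.*\bship\b", re.I),
     ["The ship is above my pay grade. Try the DoorBot."]),
    (re.compile(r"\bwho are you\b", re.I),
     ["I am a robot. Obviously."]),
]

FALLBACKS = ["I don't understand that.", "Could you rephrase that?"]

def respond(line: str) -> str:
    # Return a canned reply for the first matching pattern, else a fallback.
    # Note what is *not* happening here: no grammar, no parse tree, no model
    # of meaning. Only surface patterns are consulted.
    for pattern, replies in RULES:
        if pattern.search(line):
            return random.choice(replies)
    return random.choice(FALLBACKS)

print(respond("Can you fix the ship?"))         # matches the fix/ship rule
print(respond("Please do NOT fix the ship!"))   # same rule fires anyway: a blatant misunderstanding
print(respond("What is the meaning of life?"))  # falls through to a fallback

Scale those fallbacks up to sixteen hours of recorded audio and you have a fair approximation of the experience.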

But if the parser is weird and sad, the weirdest and saddest thing of all about Starship Titanic is how thoroughly unfunny it is. Even a compromised, dashed-off Adams novel like Mostly Harmless still has moments which can make you smile, which remind you that, yes, this is Douglas Adams you’re reading. Starship Titanic, on the other hand, is comprehensively tired and tiring, boiling Adams’s previous oeuvre down to its tritest banalities — all goofy robots and aliens, without the edge of satire and the cock-eyed insights about the human condition that mark Hitchhiker’s. Was Adams losing his touch as a humorist? Or did his own voice just get lost amidst those of dozens of other people trying to learn on the fly how to make a computer game? It’s impossible to say. It is pretty clear, however, that he had one foot out the door of the project long before it was finished. “In the end, I think he felt quite distanced from it,” says Robbie Stamp of his partner. That sentiment applied equally to all three co-founders of The Digital Village, who couldn’t fully work out just how their dreams and schemes had landed them here. In a very real way, no one involved with Starship Titanic actually wanted to make it.

I suppose it’s every critic’s duty to say something kind about even the worst of games. In that spirit, I’ll note that Starship Titanic does look very nice, with an Art Deco aesthetic that reminds me slightly of a far superior adventure game set aboard a moving vehicle, Jordan Mechner’s The Last Express. If nothing else, this demonstrates that The Digital Village knew where to find talented visual artists, and that they were sophisticated enough to choose a look for their game and stick to it. Then, too, the voice cast the creators recruited was to die for, including not only Terry Jones and Douglas Adams himself but even John Cleese, who had previously answered every inquiry about appearing in a game with some variation of “Fuck off! I don’t do games!” The music was provided by Wix Wickens, the keyboardist and musical director for Paul McCartney’s touring band. What a pity that no one from The Digital Village had a clue what to do with their pile of stellar audiovisual assets. Games were “an area about which we knew nothing,” admits Richard Creasey. That went as much for Douglas Adams as any of the rest of them; as Starship Titanic’s anachronistic parser so painfully showed, his picture of the ludic state of the art was more than a decade out of date.




Begun in May of 1996, Starship Titanic shipped in April of 1998, more than six months behind schedule. Rather bizarrely, no one involved seems ever to have considered explicitly branding it as a Hitchhiker’s game, a move that would surely have increased its commercial potential at least somewhat. (There was no legal impediment to doing so; Adams owned the Hitchhiker’s franchise outright.) Adams believed that his name on the box alone could make it a hit. Some of those around him were more dubious. “I think it was a harsh reality,” says Robbie Stamp, “that Douglas hadn’t been seen to figure big financially by anyone for a little while.” But no one was eager to have that conversation with him at the time.

So, Starship Titanic was sent out to greet an unforgiving world as its own, self-contained thing, and promptly stiffed. Even the fortuitous release the previous December of James Cameron’s blockbuster film Titanic, which had elevated another adventure game of otherwise modest commercial prospects to million-seller status, couldn’t save this one. Many of the gaming magazines and websites didn’t bother to review it at all, so 1996 did it feel in a brave new world where first-person shooters and real-time strategies were all the rage. Of those that did, GameSpot’s faint praise is typically damning: “All in all, Starship Titanic is an enjoyable tribute to an older era of adventure gaming. It feels a bit empty at times, but Douglas Adams fans and text-adventurers will undoubtedly be able to look past its shortcomings.” This is your father’s computer game, in other words. But leave it to Charles Ardai of Computer Gaming World magazine to deliver a zinger worthy of Adams himself: he called Starship Titanic a “Myst opportunity.”

One of the great ironies of this period is that, at the same time Douglas Adams was making a bad science-fiction-comedy adventure game, his erstwhile Infocom partner Steve Meretzky was making one of his own, called The Space Bar. Released the summer before Starship Titanic, it stiffed just as horribly. Perhaps if the two had found a way to reconnect and combine their efforts, they could have sparked the old magic once again.

As it was, though, Adams was badly shaken by the failure of Starship Titanic, the first creative product with his name on it to outright lose its backers a large sum of money. “Douglas’s fight had gone out of him,” says Richard Creasey. Adams found a measure of solace in blaming the audience — never an auspicious posture for any creator to adopt, but needs must. “What we decided to do in this game was go for the non-psychopath sector of the market,” he said. “And that was a little hubristic because there really isn’t a non-psychopath sector of the market.” The 1.5 million people who were buying the non-violent Myst sequel Riven at the time might have begged to differ.

Luckily, Adams had something new to be excited about: in late 1997, he had signed a development deal with Disney for a “substantial” sum of money — a deal that would, if all went well, finally lead to his long-sought Hitchhiker’s film. Wanting to be close to the action and feeling that he needed a change of scenery, he opted to pull up stakes from the Islington borough of London where he had lived since 1980 and move with his family to Los Angeles. A starry-eyed Adams was now nursing dreams of Hugh Laurie or Hugh Grant as Arthur Dent, Jim Carrey as the two-headed Zaphod Beeblebrox.

The rump of The Digital Village which he left behind morphed into h2g2, an online compendium of user-generated knowledge, an actually extant version of the fictional Hitchhiker’s Guide to the Galaxy. If you’re thinking that sounds an awful lot like Wikipedia, you’re right; the latter site, which was launched two years after h2g2 made its debut in 1999, has thoroughly superseded it today. In its day, though, h2g2 was a genuinely visionary endeavor, an early taste of the more dynamic, interactive Web 2.0 that would mark the new millennium. Adams anticipated the way we live our digital lives today to an almost unnerving degree.

The real change takes place [with] mobile computing, and that is beginning to arrive now. We’re beginning to get Internet access on mobile phones and personal digital assistants. That creates a sea change because suddenly people will be able to get information that is appropriate to where they are and who they are — standing outside the cinema or a restaurant or waiting for a bus or a plane. Or sitting having a cup of coffee at a café. With h2g2, you can look up where you are at that moment to see what it says, and if the information is not there you can add it yourself. For example, a remark about the coffee you’re drinking or a comment that the waiter is very rude.

When not setting the agenda with prescient insights like these — he played little day-to-day role in the running of h2g2 — Adams wrote several drafts of a Hitchhiker’s screenplay and knocked on a lot of doors in Hollywood inquiring about the state of his movie, only to be politely put off again and again. Slowly he learned the hard lesson that many a similarly starry-eyed creator had been forced to learn before him: that open-ended deals like the one he had signed with Disney progress — or don’t progress — on their own inscrutable timeline.

In the meanwhile, he continued to host parties — more lavish ones than ever now after his Disney windfall — and continued being a wonderful father to his daughter. He found receptive audiences on the TED Talk circuit, full of people who were more interested in hearing his Big Ideas about science and technology than quizzing him on the minutiae of Hitchhiker’s. Anyone who asked him what else he was working on at any given moment was guaranteed to be peppered with at least half a dozen excited and exciting responses, from books to films, games to television, websites to radio, even as anyone who knew him well knew that none of them were likely to amount to much. Be that as it may, he seemed more or less happy when he wasn’t brooding over Disney’s lack of follow-through, which some might be tempted to interpret as karmic retribution for the travails he had put so many publishers and editors through over the years with his own lack of same. “I love the sense of space and the can-do attitude of Americans,” he said of his new home. “It’s a good place to bring up children.” Embracing the California lifestyle with enthusiasm, he lost weight, cut back on his alcohol consumption, and tried to give up cigarettes.

By early 2001, it looked like there was finally some movement on the Hitchhiker’s movie front. Director Jay Roach, hot off the success of Austin Powers and Meet the Parents, was very keen on it, enough so that Adams was motivated to revise the screenplay yet again to his specifications. On May 11 of that year, not long after submitting these revisions, Douglas Adams went to his local gym for his regular workout. After twenty minutes on the treadmill, he paused for a breather before moving on to stomach crunches. Seconds after sitting down on a bench, he collapsed to the floor, dead. He had fallen victim to another cosmic joke as tragically piquant as that of the brilliant writer who hates to write: his heart had simply stopped beating, for no good reason that any coroner could divine. He was just 49 years old.





Sources: The books Hitchhiker: A Biography of Douglas Adams by M.J. Simpson, Wish You Were Here: The Official Biography of Douglas Adams by Nick Webb, The Frood: The Authorised and Very Official History of Douglas Adams & The Hitchhiker’s Guide to the Galaxy by Jem Roberts, The Ultimate Hitchhiker’s Guide to the Galaxy by Douglas Adams, Last Chance to See by Douglas Adams and Mark Carwardine, and Douglas Adams’s Starship Titanic by Terry Jones; Computer Gaming World of September 1998.

Online sources include GameSpot’s vintage review of Starship Titanic, an AV Club interview with Adams from January of 1998, “The Making of Starship Titanic” from Adams’s website, The Digital Village’s website (yes, it still exists), and a Guardian feature on Thomas Harris.

Starship Titanic is available for digital purchase on GOG.com.

 


The Last Days of Zork

If you follow the latest developments in modern gaming even casually, as I do, you know that Microsoft and Activision Blizzard recently concluded the most eye-watering transaction ever to take place in the industry: the former acquired the latter for a price higher than the gross national product of more than half of the world’s countries. I find it endlessly amusing to consider that Activision may have lived long enough to set that record only thanks to Infocom, that humble little maker of 1980s text adventures, whose annual revenues — revenues, mind you, not profits — never exceeded $10 million before Activision acquired it in 1986. And just how did this David save a Goliath? It happened like this:

After Bobby Kotick arranged a hostile takeover of a bankrupt and moribund Activision in 1991, he started rummaging through its archives, looking for something that could start bringing some money in quickly, in order to keep the creditors who were howling at his door at bay for a wee bit longer. He came upon the 35 text adventures which had been made by Infocom over the course of the previous decade, games which, for all that they were obviously archaic by the standards of the encroaching multimedia age, were still fondly remembered by many gamers as the very best of their breed. He decided to take a flier on them, throwing twenty of them onto one of those shiny new CD-ROMs that everyone was talking about — or, if that didn’t work for you, onto a pile of floppy disks that rattled around in the box like ice cubes in a pitcher of lemonade. Then he photocopied the feelies and hint books that had gone with the games, bound them all together into two thick booklets, and stuck those in the box as well. He called the finished collection, one of the first notable examples of “shovelware” in gaming, The Lost Treasures of Infocom.

It sold 100,000 or more units, at $60 or $70 a pop and with a profit margin to die for. The inevitable Lost Treasures II that followed, collecting most of the remaining games, was somewhat less successful, but still more than justified the (minimal) effort that had gone into its curation. (The CD-ROM version of the second collection included fourteen games, missing only Leather Goddesses of Phobos, which Activision attempted to market separately on the theory that sex sells itself; the floppy version included eleven games, lacking additionally three of Infocom’s late illustrated text adventures.) The two products’ combined earnings were indeed enough to give pause to those creditors who had been pushing for the bankrupt company to be liquidated rather than reorganized.

With a modicum of breathing room thus secured, Kotick scraped together every penny he could find for his Hail Mary pass, which was once again to rely upon Infocom’s legacy. William Volk, his multimedia guru in residence, oversaw the production of Return to Zork, a splashy graphical adventure with all the cutting-edge bells and whistles. In design terms, it was an awful game, riddled with nonsensical puzzles and sadistic dead ends. Yet that didn’t matter at all in the marketplace. Return to Zork nailed the zeitgeist perfectly by combining lingering nostalgia for Zork, Infocom’s best-selling series of games, with all of the spectacular audiovisual flash the new decade could offer up. Upon its release in late 1993, it sold several hundred thousand copies as a boxed retail product, and even more as a drop-in with the “multimedia upgrade kits” (a CD-ROM drive and a sound card in one convenient package!) that were all the rage at the time. It left Activision, if not quite in rude health yet, at least no longer on life support. “Zork on a brick would sell 100,000 copies,” crowed Bobby Kotick.

With an endorsement like that from the man at the top, a sequel to Return to Zork seemed sure to follow. Yet it proved surprisingly long in coming. Partly this was because William Volk left Activision just after finishing Return to Zork, and much of his team likewise scattered to the four winds. But it was also a symptom of strained resources in general, and of currents inside Activision that were pulling in two contradictory directions at once. The fact was that Activision was chasing two almost diametrically opposing visions of mainstream gaming’s future in the mid-1990s, one of which would show itself in the end to have been a blind alley, the other of which would become the real way forward.

Alas, it was the former that was exemplified by Return to Zork, with its human actors incongruously inserted over computer-generated backgrounds and its overweening determination to provide a maximally “cinematic” experience. This vision of “Siliwood” postulated that the games industry would become one with the movie and television industry, that name actors would soon be competing for plum roles in games as ferociously as they did for those in movies; it wasn’t only for the cheaper rents that Kotick had chosen to relocate his resuscitated Activision from Northern to Southern California.

The other, ultimately more sustainable vision came to cohabitate at the new Activision almost accidentally. It began when Kotick, rummaging yet again through the attic full of detritus left behind by his company’s previous incarnation, came across a still-binding contract with FASA for the digital rights to BattleTech, a popular board game of dueling robot “mechs.” After a long, troubled development cycle that consumed many of the resources that might otherwise have been put toward a Return to Zork sequel, Activision published MechWarrior 2: 31st Century Combat in the summer of 1995.

MechWarrior 2 was everything Return to Zork wasn’t. Rather than being pieced together out of canned video clips and pre-rendered scenes, it was powered by 3D graphics that were rendered in real time. It was exciting in a viscerally immersive, action-oriented way rather than being a passive spectacle. And, best of all in the eyes of many of its hyper-competitive players, it was multiplayer-friendly. This, suffice to say, was the real future of mainstream hardcore computer gaming. MechWarrior 2’s one similarity with Return to Zork was external to the game itself: Kotick once again pulled every string he could to get it included as a pack-in extra with hardware-upgrade kits. This time, however, the upgrades in question were the new 3D-graphics accelerators that made games like this one run so much better.

In a way, the writing was on the wall for Siliwood at Activision as soon as MechWarrior 2 soared to the stratosphere, but there were already a couple of ambitious projects in the Siliwood vein in the works at that time, which together would give the alternative vision’s ongoing viability a good, solid test. One of these was Spycraft, an interactive spy movie with unusually high production values and high thematic ambitions to go along with them: it was shot on film rather than the standard videotape, from a script written with the input of William Colby and Oleg Kalugin, American and Soviet spymasters during the Cold War. The other was Zork Nemesis.



Whatever else you can say about it, you can’t accuse Zork Nemesis of merely aping its successful predecessor. Where Return to Zork is goofy, taking its cues from the cartoon comedies of Sierra and LucasArts as well as the Zork games of Infocom, Zork Nemesis is cold and austere — almost off-puttingly so, like its obvious inspiration Myst. Then, too, in place of the abstracted room-based navigation of Return to Zork, Zork Nemesis gives you more granular nodes to jump between in an embodied, coherent three-dimensional space, again just like Myst. Return to Zork is bursting with characters, such as that “Want some rye?” guy who became an early Internet meme unto himself; Zork Nemesis is almost entirely empty, its story playing out through visions, written records, and brief snatches of contact across otherwise impenetrable barriers of time and space.

Which style of adventure game you prefer is a matter of taste. In at least one sense, though, Zork Nemesis does undeniably improve upon its predecessor. Whereas Return to Zork’s puzzles seem to have been slapped together more or less at random by a team not overly concerned with the player’s sanity or enjoyment, it’s clear that Zork Nemesis was consciously designed in all the ways that the previous Zork was not; its puzzles are often hard, but they’re never blatantly unfair. Nor do they repeat Return to Zork’s worst design sin of all: there is no way to unwittingly become a dead adventurer walking.

The plot here involves a ruthless alchemical mastermind, the Nemesis of the title, and his quest for a mysterious fifth element, a Quintessence that transcends the standard Earth, Air, Fire, and Water. The game is steeped in the Hermetic occultism that strongly influenced many of the figures who mark the transition from Medieval to Modern thought in our own world’s history, from Leonardo da Vinci to Isaac Newton. This is fine in itself; in fact, it’s a rather brilliant basis for an adventure game if you ask me, easily a more interesting idea in the abstract than yet another Zork game. The only problem — a problem which has been pointed out ad nauseam over the years since Zork Nemesis’s release — is that this game does purport to be a Zork game in addition to being about all that other stuff, and yet it doesn’t feel the slightest bit like Zork. While the Zork games of Infocom were by no means all comedy all the time — Zork III in particular is notably, even jarringly austere, and Spellbreaker is not that far behind it — they never had anything to do with earthly alchemy.

I developed the working theory as I played Zork Nemesis that it must have been originally conceived as simply a Myst-like adventure game, having nothing to do with Zork, until some marketing genius or other insisted that the name be grafted on to increase its sales potential. I was a little sad to be disabused of my pet notion by Laird Malamed, the game’s technical director, with whom I was able to speak recently. He told me that Zork Nemesis really was a Zork from the start, to the point of being listed as Return to Zork II in Activision’s account books before it was given its final name. Nevertheless, I did find one of his choices of words telling. He said that Cecilia Barajas, a former Los Angeles district attorney who became Zork Nemesis’s mastermind, was no more than “familiar” with Infocom’s Zork. So, it might not be entirely unfair after all to say that the Zork label on Zork Nemesis was more of a convenient way for Barajas to make the game she wanted to make than a wellspring of passion for her. Please don’t misunderstand me; I don’t mean for any of the preceding to come across as fannish gatekeeping, something we have more than enough of already in this world. I’m merely trying to understand, just as you presumably are, why Zork Nemesis is so very different from the Activision Zork game before it (and also the one after it, about which more later).

Of course, a game doesn’t need to be a Zork to be good. And indeed, if we forget about the Zork label, we find that Nemesis (see what I did there?) is one of the best — arguably even the best — of all the 1990s “Myst clones.” It’s one of the rare old games whose critical reputation has improved over the years, now that the hype surrounding its release and the angry cries of “But it’s not a Zork!” have died away, granting us space to see it for what it is rather than what it is not. With a budget running to $3 million or more, this was no shoestring project. In fact, the ironic truth is that both Nemesis’s budget and its resultant production values dramatically exceed those of its inspiration Myst. Its principal technical innovation, very impressive at the time, is the ability to smoothly scroll through a 360-degree panorama in most of the nodes you visit, rather than being limited to an arbitrary collection of fixed views. The art direction and the music are superb, maintaining a consistently sinister, occasionally downright macabre atmosphere. And it’s a really, really big game too, far bigger than Myst, with far more depth to its fiction despite its almost equally deserted environments. If we scoff just a trifle because this is yet one more adventure game that requires you to piece together a backstory from journal pages rather than living a proper foreground story of your own, we also have to acknowledge that the backstory is interesting enough that you want to find and read said pages. This is a game that, although it certainly doesn’t reinvent any wheels, implements every last one of them with care.

My own objections are the same ones that I always tend to have toward this sub-genre, and that thus probably say more about me than they do about Nemesis. The oppressive atmosphere, masterfully inculcated though it is, becomes a bit much after a while; I start wishing for some sort of tonal counterpoint to this all-pervasively dominant theme, not to mention someone to actually talk to. And then the puzzles, although not unfair, are sometimes quite difficult — more difficult than I really need them to be. Nemesis is much like Riven, Myst’s official sequel, in that it wants me to work a bit harder for my fun than I have the time or energy for at this point in my life. Needless to say, though, your mileage may vary.


Zork Nemesis’s story is told through ghostly (and non-interactive) visions…

…as well as through lots of books, journals, and letters. Myst fans will feel right at home.

The puzzles too are mostly Myst-style set-pieces rather than exercises in inventory-object manipulation.

The macabre atmosphere becomes downright gruesome in places.

Venus dispenses hints if you click on her. What is the ancient Roman goddess of love, as painted by the seventeenth-century Spanish master Diego Velázquez, doing in the world of Zork? Your guess is as good as mine. Count it as just one more way in which this Zork can scarcely be bothered to try to be a Zork at all.



Released on the same day in April of 1996 as Spycraft, Activision’s other big test of the Siliwood vision’s ongoing viability, Zork Nemesis was greeted with mixed reviews. This was not surprising for a Myst clone, a sub-genre that the hardcore-gaming press never warmed to. Still, some of the naysayers waxed unusually vitriolic upon seeing such a beloved gaming icon as Zork sullied with the odor of the hated Myst. The normally reliable and always entertaining Charles Ardai of Computer Gaming World, the print journal of record for the hobby, whose reviews could still make or break a game as a marketplace proposition even in this dawning Internet age, dinged Zork Nemesis for not having much of anything to do with Infocom’s Zork, which was fair. Yet then he went on to characterize it as a creatively bankrupt, mindless multimedia cash-in, which was not: “Give ’em a gorgeous photo-realistic environment full of fantastic landscapes, some quasi-liturgical groaning on the soundtrack, and a simple puzzle every so often to keep their brains engaged, and you’re off to the bank to count your riches. Throw in some ghostly visions and a hint of the horrific and you can snag the 7th Guest crowd too.” One can only assume from this that Ardai never even bothered to try to play the game, but simply hated it on principle. I maintain that no one who has done so could possibly describe Zork Nemesis’s puzzles as “simple,” no matter how much smarter than I am he might happen to be.

Even in the face of headwinds like these, Zork Nemesis still sold considerably better than the more positively reviewed Spycraft, seemingly demonstrating that Bobby Kotick’s faith in “Zork on a brick” might not yet be completely misplaced. Its lifetime sales probably ended up in the neighborhood of 150,000 to 200,000 copies — not a blockbuster hit by any means, and certainly a good deal less than the numbers put up by Return to Zork, but still more than the vast majority of Myst clones, enough for it to earn back the money it had cost to make plus a little extra.[2] Whereas there would be no more interactive spy movies forthcoming from Activision, Zork Nemesis did just well enough that Kotick could see grounds for funding another Zork game, as long as it was made on a slightly less lavish budget, taking advantage of the engine that had been created for Nemesis. And I’m very glad he could, because the Zork game that resulted is a real gem.



With Cecilia Barajas having elected to move on to other things, Laird Malamed stepped up into her role for the next game. He was much more than just “familiar” with Zork. He had gotten a copy of the original Personal Software “barbarian” Zork — so named because of its hilariously inappropriate cover art — soon after his parents bought him his first Apple II as a kid, and had grown up with Infocom thereafter. Years later, when he had already embarked on a career as a sound designer in Hollywood, a chance meeting with Return to Zork put Activision on his radar. He applied and was hired there, giving up one promising career for another.

He soon became known both inside and outside of Activision as the keeper of the Infocom flame, the only person in the company’s senior ranks who saw that storied legacy as more than just something to be exploited commercially. While still in the early stages of making Activision’s third graphical Zork, he put together a replacement for the old Lost Treasures of Infocom collections: a new one called Classic Text Adventure Masterpieces, with 33 of the canonical 35 games on a single CD, along with all of their associated documentation in digital format. (The Hitchhiker’s Guide to the Galaxy and Shogun, Infocom’s only two licensed titles, were missing, in both cases because their licensing contracts had expired.) He did this more because he simply felt these games ought to be available than because he expected the collection to make a lot of money for his employer. In the same spirit, he reached out to the amateur interactive-fiction community that was still authoring text adventures in the Infocom mold, and arranged to include the top six finishers from the recently concluded First Interactive Fiction Competition on the same disc. He searched through Activision’s storage rooms to find a backup of the old DEC mainframe Infocom had used to create its games. This he shared with Graham Nelson and a few other amateur-IF luminaries, whilst selecting a handful of interesting, entertaining, and non-embarrassing internal emails to include on the Masterpieces disc as well.[3] No one at Activision had ever engaged with the company’s Infocom inheritance in such an agenda-less, genuine way before him; nor would anyone do so after him.

He brought to the new graphical Zork game a story idea that had a surprisingly high-brow inspiration: the “Grand Inquisitor” tale-within-a-tale in Fyodor Dostoevsky’s 1880 novel The Brothers Karamazov, an excerpt which stands so well on its own that it’s occasionally been published that way. I can enthusiastically recommend reading it, whether you tackle the rest of the novel or not. (Laird admitted to me when we talked that he himself hadn’t yet managed to finish the entire book when he decided to use a small part of it as the inspiration for his game.) Dostoevsky’s Grand Inquisitor is a leading figure of the Spanish Inquisition, who harangues a returned Jesus Christ for his pacifism, his humility, and his purportedly naïve rejection of necessary hierarchies of power. It is, in other words, an exercise in contrast, setting the religion of peace and love that was preached by Jesus up against what it became in the hands of the Medieval Catholic popes and other staunch institutionalists.

For its part, Zork: Grand Inquisitor doesn’t venture into quite such politically fraught territory as this. Its titular character is an ideological rather than religious tinpot dictator, of the sort all too prevalent in the 20th and 21st centuries on our world. He has taken over the town of Port Foozle, where he has banned all magic and closed all access to the Great Underground Empire that lies just beneath the town. You play a humble traveling salesperson who comes into possession of a magic lantern — a piece of highly illegal contraband in itself — that contains the imprisoned spirit of Dalboz of Gurth, the rightful Dungeon Master of the Empire. He encourages and helps you to make your way into his forbidden realm, to become a literal underground resistance fighter against the Grand Inquisitor.

The preceding paragraphs may have led you to think that Zork: Grand Inquisitor is another portentous, serious game. If so, rest assured that it isn’t. Not at all. Its tone and feel could hardly be more different from those of Zork Nemesis. Although there are some heavy themes lurking in the background, they’re played almost entirely for laughs in the foreground. This strikes me as no bad approach. There are, after all, few more devastating antidotes to the totalitarian absurdities of those who would dictate to others what sort of lives they should lead and what they should believe in than a dose of good old full-throated laughter. As Hannah Arendt understood, the Grand Inquisitors among us are defined by the qualities they are missing rather than any that they possess: qualities like empathy, conscience, and moral intelligence. We should not hesitate to mock them for being the sad, insecure, incompletely realized creatures they are.

Just as I once suspected that Zork Nemesis didn’t start out as a Zork game at all, I was tempted to assume that this latest whipsaw shift in atmosphere for Zork at Activision came as a direct response to the vocal criticisms of the aforementioned game’s lack of Zorkiness. Alas, Laird Malamed disabused me of that clever notion as well. Grand Inquisitor was, he told me, simply the Zork that he wanted to make, initiated well before the critics’ and fans’ verdicts on the last game started to pour in in earnest. He told me that he practically “begged” Margaret Stohl, who has since gone on to become a popular fantasy novelist in addition to continuing to work in games, to come aboard as lead designer and writer and help him to put his broad ideas into a more concrete form, for he knew that she possessed exactly the comedic sensibility he was going for.

Regardless of the original reason for the shift in tone, Laird and his team didn’t hesitate to describe Grand Inquisitor later in its development cycle as a premeditated response to the backlash about Nemesis’s Zork bona fides, or rather its lack thereof. This time, they told magazines like Computer Gaming World, they were determined to “let Zork be Zorky”: “to embrace what was wonderful about the old text adventures, a fantasy world with an undercurrent of humor.”

Certainly Grand Inquisitor doesn’t lack for the concrete Zorkian tropes that were also all over Return to Zork. From the white house in the forest to Flood Control Dam #3 to Dalboz’s magic lantern itself, the gang’s all here. But all of these disparate homages are integrated into a larger Zorkian tapestry in a way Activision never managed elsewhere. Return to Zork is a compromised if not cynical piece of work, its slapstick tone the result of a group of creators who saw Zork principally as a grab bag of tropes to be thrown at the wall one after another. And Nemesis, of course, has little to do with Zork at all. But Grand Inquisitor walks like a Zork, talks like a Zork, and is smart amidst its silliness in the same way as a Zork of yore. In accordance with its heritage, it’s an unabashedly self-referential game, well aware of the clichés and limitations of its genre and happy to poke fun at them. For example, the Dungeon Master here dubs you the “AFGNCAAP”: the “Ageless, Faceless, Gender-Neutral, Culturally Ambiguous Adventure Person,” making light of a longstanding debate, ancient even at the time of Grand Inquisitor’s release, over whether the adventurer in the game must be you the player or whether it’s acceptable to ask you to take control of a separate, strongly characterized protagonist.

It’s plain from first to last that this game was helmed by someone who knew Zork intimately and loved it dearly. And yet the game is never gawky in that obsessive fannish way that can be so painful to witness; it’s never so much in thrall to its inspiration that it forgets to be its own thing. This game is comfortable in its own skin, and can be enjoyed whether you’ve been steeped in the lore of Zork for decades or are coming to it completely cold. This is the way you do fan service right, folks.

Although it uses an engine made for a Myst-like game, Grand Inquisitor plays nothing like Myst. This game is no exercise in contemplative, lonely puzzle-solving; its world is alive. As you wander about, Dungeon Master Dalboz chirps up from his lantern constantly with banter, background, and subtle hints. He becomes your friend in adventure, keeping you from ever feeling too alone. In time, other disembodied spirits join you as well, until you’re wandering around with a veritable Greek chorus burbling away behind you. The voice acting is uniformly superb.

Another prominent recurring character is Antharia Jack, a poor man’s Indiana Jones who’s played onscreen as well as over the speakers by Dirk Benedict, a fellow very familiar with being a stand-in for Harrison Ford in his most iconic roles, having also played the Han Solo wannabe Starbuck in the delightfully cheesy old television Star Wars cash-in Battlestar Galactica. Benedict, one of those actors who’s capable of portraying exactly one character but who does it pretty darn well, went on to star in The A-Team after his tenure as an outer-space fighter jockey was over. His smirking, skirt-chasing persona was thus imprinted deeply on the memories of many of the twenty-somethings whom Activision hoped to tempt into buying Grand Inquisitor. This sort of stunt-casting of actors a bit past their pop-culture prime was commonplace in productions like these, but here at least it’s hard to fault the results. Benedict leans into Antharia Jack with all of his usual gusto. You can’t help but like the guy.

When it comes to its puzzles, Grand Inquisitor’s guiding ethic is to cut its poor, long-suffering AFGNCAAP a break. All of the puzzles here are well-clued and logical within the context of a Zorkian world, the sort of puzzles that are likely to stump you only just long enough to make you feel satisfyingly smart after you solve them. There’s a nice variety to them, with plenty of the “use object X on thing Y” variety to go along with some relatively un-taxing set-piece exercises in pushing buttons or pulling levers just right. But best of all are the puzzles that you solve by magic.

Being such a dedicated Infocom aficionado, Laird Malamed remembered something that most of his colleagues probably never knew at all: that the canon of Infocom Zork games encompassed more than just the ones that had that name on their boxes, that there was also a magic-oriented Enchanter trilogy which took place in the same universe. At the center of those games was one of the most brilliant puzzle mechanics Infocom ever invented, a system of magic that had you hunting down spell scrolls to copy into your spell book, after which they were yours to cast whenever you wished. This being Infocom, however, they were never your standard-issue Dungeons & Dragons Fireball spells, but rather ones that did weirdly specific, esoteric things, often to the point that it was hard to know what they were really good for — until, that is, you finally stumbled over that one nail for which they were the perfect hammer. Grand Inquisitor imports this mechanic wholesale. Here as well, you’re forever trying to figure out how to get your hands on that spell scroll that’s beckoning to you teasingly from the top of a tree or wherever, and then, once you’ve secured it, trying to figure out where it can actually do something useful for you. This latter is no trivial exercise when you’re stuck with spells like IGRAM (“turn purple things invisible”) and KENDALL (“simplify instructions”). Naturally, much of the fun comes from casting the spells on all kinds of random stuff, just to see what happens. Following yet again in the footsteps of Infocom, Laird’s team at Activision implemented an impressive number of such interactions, useless though they are for any purpose other than keeping the AFGNCAAP amused.

Grand Inquisitor isn’t an especially long game on any terms, and the fairly straightforward puzzles mean you’ll sail through what content there is much more quickly than you might through a game like Nemesis. All in all, it will probably give you no more than three or four evenings’ entertainment. Laird Malamed confessed to me that a significant chunk of the original design document had to be cut in the end in order to deliver the game on time and on budget; this was a somewhat marginal project from the get-go, not one to which Activision’s bean counters were ever going to give a lot of slack. Yet even this painful but necessary surgery was done unusually well. Knowing from the beginning that the scalpel might have to come out before all was said and done, the design team consciously used a “modular” approach, from which content could be subtracted (or added, if they should prove to be so fortunate) without undermining the structural integrity, if you will, of the game as a whole. As a result of their forethought, Grand Inquisitor doesn’t feel like a game that’s been gutted. It rather feels very complete just as it is. Back in the day, when Activision was trying to sell it for $40 or $50, its brevity was nevertheless a serious disadvantage. Today, when you can pick it up in a downloadable version for just a few bucks, it’s far less of a problem. As the old showbiz rule says, better to leave ’em wanting more than wishing you’d just get off the stage already.


 

“You are standing in an open field west of a white house, with a boarded front door.” Unfortunately, the property has been condemned by the Grand Inquisitor. “Who is the boss of you? Me! I am the boss of you!”

The “spellchecker” is a good example of Grand Inquisitor’s silly but clever humor, which always has time for puns. The machine’s purpose is, as you might have guessed, to validate spell scrolls.

This subway map looks… complicated. Wouldn’t it be nice if there was a way to simplify it in a burst of magic? Laird told me that this puzzle was inspired by recollections of trying to make sense of a map of the London Underground as a befuddled tourist.

Nothing sums up the differences between Zork Nemesis and Zork: Grand Inquisitor quite so perfectly as the latter’s chess puzzle. In Nemesis, you’d be futzing around with this thing forever. And in Grand Inquisitor? As Scorpia wrote in her review for Computer Gaming World, “Think of what you’ve [always] felt like doing with an adventure-game chess puzzle, and act accordingly.”

There are some set-piece puzzles that can’t be dispatched quite so easily. An instruction booklet tells you to never, ever close all four sluices of Flood Control Dam Number 3 at once. So what do you try to do?

Playing Strip Grue, Fire, Water with Antharia Jack. The cigars were no mere affectation of Dirk Benedict. His costars complained repeatedly about the cloud of odoriferous smoke in which he was constantly enveloped. A true blue Hollywood eccentric of the old-school stripe, Benedict remains convinced to this day that the key to longevity is tobacco combined with a macrobiotic diet. Ah, well… given that he’s reached 79 years of age and counting as of this writing, it seems to be working out for him so far.

Be careful throwing around them spells, kid! Deaths in Grand Inquisitor are rendered in text. Not only is this a nice nostalgic homage to the game’s roots, it helped to maximize the limited budget by avoiding the expense of portraying all those death scenes in graphics.



Laird Malamed had no sense during the making of Grand Inquisitor that this game would mark the end of Zork’s long run. On the contrary, he had plans to turn it into the first game of a new trilogy, the beginning of a whole new era for the venerable franchise. In keeping with his determination to bring Zork back to the grass roots who knew and loved it best, he came up with an inspired guerrilla-marketing scheme. He convinced the former Infocom Implementors Marc Blank and Mike Berlyn to write up a short text-adventure prelude to the story told in Grand Inquisitor proper. Then he got Kevin Wilson, the organizer of the same Interactive Fiction Competition whose games had featured on the Masterpieces CD, to program their design in Inform, a language that compiled to the Z-Machine, Infocom’s old virtual machine, for which interpreters had long been available on countless computing platforms, both current and archaic. Activision released the end result for free on the Internet in the summer of 1997, as both a teaser for the graphical game that was to come and a proof that Zork was re-embracing its roots. Zork: The Undiscovered Underground isn’t a major statement by any means, but it stands today, as it did then, as a funny, nostalgic final glance back to the days when Zork was nothing but words on a screen.

Unfortunately, all of Laird’s plans for Zork’s broader future went up in smoke when Grand Inquisitor was released in November of 1997 and put up sales numbers well short of those delivered by Nemesis, despite reviews that were almost universally glowing this time around. Those Infocom fans who played it mostly adored it for finally delivering on the promise of its name, even if it was a bit short. The problem was that that demographic was now moving into the busiest phase of life, when careers and children tend to fill all of the hours available and then some. There just weren’t enough of those people still buying games to deliver the sales that a mass-market-focused publisher like Activision demanded, even as the Zork name meant nothing whatsoever to the newer generation of gamers who had cut their teeth on DOOM and Warcraft. Perhaps Bobby Kotick should have just written “Zork” on a brick after all, for Grand Inquisitor didn’t sell even 100,000 units.

And so, twenty years after a group of MIT graduate students had gotten together to create a game that was even better than Will Crowther and Don Woods’s Adventure, Zork’s run came to an end, taking with it any remaining dregs of faith at Activision in the Siliwood vision. Apart from one misconceived and blessedly quickly abandoned effort to revive the franchise as a low-budget MMORPG during the period when those things were sprouting like weeds, no Zork game has appeared since. We can feel sad about this if we must, but the reality is that nothing lasts forever. Far better, it seems to me, for Zork to go out with Grand Inquisitor, one of the highest of all its highs, than to be recycled again and again on a scale of diminishing returns, as has happened to some other classic gaming franchises. Likewise, I’m kind of happy that no one who made Grand Inquisitor knew they were making the very last Zork adventure. Their ignorance caused them to just let Zork be Zork, and meant they were never even tempted to turn their game into some over-baked Final Statement.

In games as in life, it’s always better to celebrate what we have than to lament what might have been. With that in mind, then, let me warmly recommend Zork: Grand Inquisitor to any fans of adventure games among you readers who have managed not to play it yet. It really doesn’t matter whether you know the rest of Zork or not; it stands just fine on its own. And that too is the way it ought to be.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Zork Nemesis: The Official Strategy Guide by Peter Spear and Zork: Grand Inquisitor: The Official Strategy Guide by Margaret Stohl; Computer Gaming World of August 1996, February 1997, and March 1998; InterActivity of May 1996; Next Generation of August 1997; Los Angeles Times of November 30, 1996.

Online sources include a 1996 New Media profile of Activision and “The Trance Experience of Zork Nemesis” at Animation World.

My thanks to Laird Malamed for taking the time from his busy schedule to talk to me about his history with Zork. Note that any opinions expressed in this article that are not explicitly attributed to him are my own.

Zork Nemesis and Zork: Grand Inquisitor are both available as digital purchases at GOG.com.

Footnotes
1 The CD-ROM version included fourteen games, missing only Leather Goddesses of Phobos, which Activision attempted to market separately on the theory that sex sells itself. The floppy version included only eleven games, additionally lacking three of Infocom’s late illustrated text adventures.
2 In my last article, about Cyan’s Riven, I first wrote that Zork Nemesis sold 450,000 copies. This figure was not accurate; I was misreading one of my sources. My bad, as I think the kids are still saying these days. I’ve already made the necessary correction there.
3 This “Infocom hard drive” eventually escaped the privileged hands into which it was entrusted, going on to cause some minor scandals and considerable interpersonal angst; suffice to say that not all of its contents were non-embarrassing. I have never had it in my possession. No, really, I haven’t. It’s been rendered somewhat moot in recent years anyway by the stellar work Jason Scott has done collecting primary sources for the Infocom story at archive.org.
 


The Rise of POMG, Part 1: It Takes a Village…

No one on their deathbed ever said, “I wish I had spent more time alone with my computer!”

— Dani Bunten Berry

If you ever want to feel old, just talk to the younger generation.

A few years ago now, I met the kids of a good friend of mine for the very first time: four boys between the ages of four and twelve, all more or less crazy about videogames. As someone who spends a lot of his time and earns a lot of his income writing about games, I arrived at their house with high expectations attached.

Alas, I’m afraid I proved a bit of a disappointment to them. The distance between the musty old games that I knew and the shiny modern ones that they played was just too far to bridge; shared frames of reference were tough to come up with. This was more or less what I had anticipated, given how painfully limited I already knew my knowledge of modern gaming to be. But one thing did genuinely surprise me: it was tough for these youngsters to wrap their heads around the very notion of a game that you played to completion by yourself and then put on the shelf, much as you might a book. The games they knew, from Roblox to Fortnite, were all social affairs that you played online with friends or strangers, that ended only when you got sick of them or your peer group moved on to something else. Games that you played alone, without at the very least leader boards and achievements on-hand to measure yourself against others, were utterly alien to them. It was quite a reality check for me.

So, I immediately started to wonder how we had gotten to this point — a point not necessarily better or worse than the sort of gaming that I knew growing up and am still most comfortable with, just very different. This series of articles should serve as the beginning of an answer to that complicated question. Its primary focus is not so much how computer games went multiplayer, nor even how they first went online; those things are in some ways the easy, obvious parts of the equation. It’s rather how games did those things persistently — i.e., permanently, so that each session became part of a larger meta-game, if you will, embedded in a virtual community. Or perhaps the virtual community is embedded in the game. It all depends on how you look at it, and which precise game you happen to be talking about. Whichever way, it has left folks like me, whose natural tendency is still to read games like books with distinct beginnings, middles, and ends, anachronistic iconoclasts in the eyes of the youthful mainstream.

Which, I hasten to add, is perfectly okay; I’ve always found the ditch more fun than the middle of the road anyway. Still, sometimes it’s good to know how the other 90 percent lives, especially if you claim to be a gaming historian…



“Persistent online multiplayer gaming” (POMG, shall we say?) is a mouthful to be sure, but it will have to do for lack of a better descriptor of the phenomenon that has created such a divide between myself and my friend’s children.  It’s actually older than you might expect, having first come to be in the 1970s on PLATO, a non-profit computer network run out of the University of Illinois but encompassing several other American educational institutions as well. Much has been written about this pioneering network, which uncannily presaged in so many of its particulars what the Internet would become for the world writ large two decades later. (I recommend Brian Dear’s The Friendly Orange Glow for a book-length treatment.) It should suffice for our purposes today to say that PLATO became host to, among other online communities of interest, an extraordinarily vibrant gaming culture. Thanks to the fact that PLATO games lived on a multi-user network rather than standalone single-user personal computers, they could do stuff that most gamers who were not lucky enough to be affiliated with a PLATO-connected university would have to wait many more years to experience.

The first recognizable single-player CRPGs were born on PLATO in the mid-1970s, inspired by the revolutionary new tabletop game known as Dungeons & Dragons. They were followed by the first multiplayer ones in amazingly short order. Already in 1975’s Moria (a completely different game from the 1983 single-player roguelike that bore the same name), players met up with their peers online to chat, brag, and sell or trade loot to one another. When they were ready to venture forth to kill monsters, they could do so in groups of up to ten, pooling their resources and sharing the rewards. A slightly later PLATO game called Oubliette implemented the same basic concept in an even more sophisticated way. The degree of persistence of these games was limited by a lack of storage capacity — the only data that was saved between sessions were the statistics and inventory of each player’s character, with the rest of the environment being generated randomly each time out — but they were miles ahead of anything available for the early personal computers that were beginning to appear at the same time. Indeed, Wizardry, the game that cemented the CRPG’s status as a staple genre on personal computers in 1981, was in many ways simply a scaled-down version of Oubliette, with the multiplayer party replaced by a party of characters that were all controlled by the same player.

Chester Bolingbroke, better known online as The CRPG Addict, plays Moria. Note the “Group Members” field at bottom right. Chester is alone here, but he could be adventuring with up to nine others.

A more comprehensive sort of persistence arrived with the first Multi-User Dungeon (MUD), developed by Roy Trubshaw and Richard Bartle, two students at the University of Essex in Britain, and first deployed there in a nascent form in late 1978 or 1979. A MUD borrowed the text-only interface and presentation of Will Crowther and Don Woods’s seminal game of Adventure, but the world it presented was a shared, fully persistent one between its periodic resets to a virgin state, chockablock with other real humans to interact with and perhaps fight. “The Land,” as Bartle dubbed his game’s environs, expanded to more than 600 rooms by the early 1980s, even as its ideas and a good portion of its code were used to set up other, similar environments at many more universities.

In the meanwhile, the first commercial online services were starting up in the United States. By 1984, you could, for the price of a substantial hourly fee, dial into the big mainframes of services like CompuServe using your home computer. Once logged in there, you could socialize, shop, bank, make travel reservations, read newspapers, and do much else that most people wouldn’t begin to do online until more than a decade later — including gaming. For example, CompuServe offered MegaWars, a persistent grand-strategy game of galactic conquest whose campaigns took groups of up to 100 players four to six weeks to complete. (Woe betide the ones who couldn’t log in for some reason of an evening in the midst of that marathon!) You could also find various MUDs, as well as Island of Kesmai, a multiplayer CRPG boasting most of the same features as PLATO’s Oubliette in a genuinely persistent world rather than a perpetually regenerated one. CompuServe’s competitor GEnie had Air Warrior, a multiplayer flight simulator with bitmapped 3D graphics and sound effects to rival any of the contemporaneous single-player simulators on personal computers. For the price of $11 per hour, you could participate in grand Air Warrior campaigns that lasted three weeks each and involved hundreds of other subscribers, organizing and flying bombing raids and defending against the enemy’s attacks on their own lines. In 1991, America Online put up Neverwinter Nights (not the same game as the 2002 BioWare CRPG of the same name), which did for the “Gold Box” line of licensed Dungeons & Dragons CRPGs what MUD had done for Adventure and Air Warrior had done for flight simulators, transporting the single-player game into a persistent multiplayer space.

All of this stuff was more or less incredible in the context of the times. At the same time, though, we mustn’t forget that it was strictly the purview of a privileged elite, made up of those with login credentials for institutional-computing networks or money in their pockets to pay fairly exorbitant hourly fees to feed their gaming habits. So, I’d like to back up now and tell a different story of POMG — one with more of a populist thrust, focusing on what was actually attainable by the majority of people out there, the ones who neither had access to a university’s mainframe nor could afford to spend hundreds of dollars per month on a hobby. Rest assured that the two narratives will meet before all is said and done.



POMG came to everyday digital gaming in the reverse order of the words that make up the acronym: first games were multiplayer, then they went online, and then these online games became persistent. Let’s try to unpack how that happened.

From the very start, many digital games were multiplayer, optionally if not unavoidably so. Spacewar!, the program generally considered the first fully developed graphical videogame, was exclusively multiplayer from its inception in the early 1960s. Ditto Pong, the game that launched Atari a decade later, and with it a slow-building popular craze for electronic games, first in public arcades and later in living rooms. Multiplayer here was not so much down to design intention as technological affordances. Pong was an elaborate hard-wired state machine rather than a full-blown programmable computer, relying on discrete logic circuits and potentiometers and the like to do its “thinking.” It was more than hard enough just to get a couple of paddles and a ball moving around on the screen of a gadget like this; a computerized opponent was a bridge too far.

Very quickly, however, programmable microprocessors entered the field, changing everyone’s cost-benefit analyses. Building dual controls into an arcade cabinet was expensive, and the end result tended to take up a lot of space. The designers of arcade classics like Asteroids and Galaxian soon realized that they could replace the complications of a human opponent with hordes of computer-controlled enemies, flying in rudimentary, partially randomized patterns. Bulky multiplayer machines thus became rarer and rarer in arcades, replaced by slimmer, more standardized single-player cabinets. After all, if you wanted to compete with your friends in such games, there was still a way to do so: you could each play a round against the computerized enemies and compare your scores afterward.

While all of this was taking shape, the Trinity of 1977 — the Radio Shack TRS-80, Apple II, and Commodore PET — had ushered in the personal-computing era. The games these early microcomputers played were sometimes ports or clones of popular arcade hits, but just as often they were more cerebral, conceptually ambitious affairs where reflexes didn’t play as big — or any — role: flight simulations, adventure games, war and other strategy games. The last were often designed to be played optimally or even exclusively against another human, largely for the same reason Pong had been made that way: artificial intelligence was a hard thing to implement under any circumstances on an 8-bit computer with as little as 16 K of memory, and it only got harder when you were asking said artificial intelligence to formulate a strategy for Operation Barbarossa rather than to move a tennis racket around in front of a bouncing ball. Many strategy-game designers in these early days saw multiplayer options almost as a necessary evil, a stopgap until the computer could fully replace the human player, thus alleviating that eternal problem of the war-gaming hobby on the tabletop: the difficulty of finding other people in one’s neighborhood who were able and willing to play such weighty, complex games.

At least one designer, however, saw multiplayer as a positive advantage rather than a kludge — in fact, as the way the games of the future by all rights ought to be. “When I was a kid, the only times my family spent together that weren’t totally dysfunctional were when we were playing games,” remembered Dani Bunten Berry. From the beginning of her design career in 1979, when she made an auction game called Wheeler Dealers for the Apple II (credited, like all of her games mentioned in this article, to Dan Bunten, the name under which she lived until 1992), multiplayer was her priority. In fact, she was willing to go to extreme lengths to make it possible; in addition to a cassette tape containing the software, Wheeler Dealers shipped with a custom-made hardware add-on, the only method she could come up with to let four players bid at once. Such experiments culminated in M.U.L.E., one of the first four games ever published by Electronic Arts, a deeply, determinedly social game of economics and, yes, auctions for Atari and Commodore personal computers that many people, myself included, still consider her unimpeachable masterpiece.

A M.U.L.E. auction in progress.

And yet it was Seven Cities of Gold, her second game for Electronic Arts, that became a big hit. Ironically, it was also the first she had ever made with no multiplayer option whatsoever. She was learning to her chagrin that games meant to be played together on a single personal computer were a hard sell; such machines were typically found in offices and bedrooms, places where people went to isolate themselves, not in living rooms or other spaces where they went to be together. She decided to try another tack, thereby injecting the “online” part of POMG into our discussion.

In 1988, Electronic Arts published Berry’s Modem Wars, a game that seems almost eerily prescient in retrospect, anticipating the ludic zeitgeist of more than a decade later with remarkable accuracy. It was a strategy game played in real time (although not quite a real-time strategy of the resource-gathering and army-building stripe that would later be invented by Dune II and popularized by Warcraft and Command & Conquer). And it was intended to be played online against another human sitting at another computer, connected to yours by the gossamer thread of a peer-to-peer modem hookup over an ordinary telephone line. Like most of Berry’s games, it didn’t sell all that well, being a little too far out in front of the state of her nation’s telecommunications infrastructure.

Nevertheless, she continued to push her agenda of computer games as ways of being entertained together rather than alone over the years that followed. She never did achieve the breakout hit she craved, but she inspired countless other designers with her passion. She died far too young in 1998, just as the world was on the cusp of embracing her vision on a scale that even she could scarcely have imagined. “It is no exaggeration to characterize her as the world’s foremost authority on multiplayer computer games,” said Brian Moriarty when he presented Dani Bunten Berry with the first ever Game Developers Conference Lifetime Achievement Award two months before her death. “Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes [representing her eleven published games] the prototypes of the defining art form of the 21st century.” Let this article and the ones that will follow it, written well into said century, serve as partial proof of the truth of his words.

Danielle Bunten Berry, 1949-1998.

For by the time Moriarty spoke them, other designers had been following the trails she had blazed for quite some time, often with much more commercial success. A good early example is Populous, Peter Molyneux’s strategy game in real time (although, again, not quite a real-time strategy) that was for most of its development cycle strictly a peer-to-peer online multiplayer game, its offline single-player mode being added only during the last few months. An even better, slightly later one is DOOM, John Carmack and John Romero’s game of first-person 3D mayhem, whose star attraction, even more so than its sadistic single-player levels, was the “deathmatch” over a local-area network. Granted, these testosterone-fueled, relentlessly zero-sum contests weren’t quite the same as what Berry was envisioning for gaming’s multiplayer future near the end of her life; she wished passionately for games with a “people orientation,” directed toward “the more mainstream, casual players who are currently coming into the PC market.” Still, as the saying goes, you have to start somewhere.

But there is once more a caveat to state here about access, or rather the lack thereof. Being built for local networks only — i.e., networks that lived entirely within a single building or at most a small complex of them — DOOM deathmatches were out of reach on a day-to-day basis for those who didn’t happen to be students or employees at institutions with well-developed data-processing departments and permissive or oblivious authority figures. Outside of those ivory towers, this was the era of the “LAN party,” when groups of gamers would all lug their computers over to someone’s house, wire them together, and go at it over the course of a day or a weekend. These occasions went on to become treasured memories for many of their participants, but they achieved that status precisely because they were so sporadic and therefore special.

And yet DOOM’s rise corresponded with the transformation of the Internet from an esoteric tool for the technological elite to the most flexible medium of communication ever placed at the disposal of the great unwashed, thanks to a little invention out of Switzerland called the World Wide Web. What if there was a way to move DOOM and other games like it from a local network onto this one, the mother of all wide-area networks? Instead of deathmatching only with your buddy in the next cubicle, you would be able to play against somebody on another continent if you liked. Now wouldn’t that be cool?

The problem was that local-area networks ran over a protocol known as IPX, while the Internet ran on a completely different one called TCP/IP. Whoever could bridge that gap in a reasonably reliable, user-friendly way stood to become a hero to gamers all over the world.



Jay Cotton discovered DOOM in the same way as many another data-processing professional: when it brought down his network. He was employed at the University of Georgia at the time, and was assigned to figure out why the university’s network kept buckling under unprecedented amounts of spurious traffic. He tracked the cause down to DOOM, the game that half the students on campus seemed to be playing more than half the time. More specifically, the problem was caused by a bug, which was patched out of existence by John Carmack as soon as he was informed. Problem solved. But Cotton stuck around to play, the warden seduced by the inmates of the asylum.

He was soon so much better at the game than anyone else on campus that he was getting a bit bored. Looking for worthier opponents, he stumbled across a program called TCPSetup, written by one Jake Page, which was designed to translate IPX packets into TCP/IP ones and vice versa on the fly, “tricking” DOOM into communicating across the vast Internet. It was cumbersome to use and extremely unreliable, but on a good day it would let you play DOOM over the Internet for brief periods of time at least, an amazing feat by any standard. Cotton would meet other players on an Internet chat channel dedicated to the game, they’d exchange IP addresses, and then they’d have at it — or try to, depending on the whims of the Technology Gods that day.
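For readers curious about what such a bridge actually has to do, here is a minimal sketch of the concept in modern Python. To be clear, this is not TCPSetup’s or Kali’s actual code — those were DOS-era programs that intercepted the game’s real IPX traffic — and every name and port number below is invented for the example. UDP broadcasts stand in for IPX, since the central problem is the same either way: discrete LAN packets must be stuffed into TCP’s continuous byte stream on one end and carved back out on the other.

```python
import select
import socket
import struct

LAN_PORT = 5400     # hypothetical port the game uses for its LAN traffic
TUNNEL_PORT = 5401  # hypothetical TCP port linking the two bridges

def run_bridge(peer_ip, listen=False):
    # Socket on which we see the game's local broadcast packets
    # (standing in for real IPX interception here).
    lan = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    lan.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    lan.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    lan.bind(("", LAN_PORT))

    # One TCP connection across the Internet to the other bridge.
    if listen:
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", TUNNEL_PORT))
        server.listen(1)
        tunnel, _ = server.accept()
    else:
        tunnel = socket.create_connection((peer_ip, TUNNEL_PORT))

    stream = b""
    while True:
        ready, _, _ = select.select([lan, tunnel], [], [])
        if lan in ready:
            # A packet from the local game: prefix it with its length
            # and ship it down the TCP pipe.
            packet, _ = lan.recvfrom(2048)
            tunnel.sendall(struct.pack("!H", len(packet)) + packet)
        if tunnel in ready:
            data = tunnel.recv(4096)
            if not data:
                break  # the far side hung up
            stream += data
            # TCP delivers a byte stream, not packets, so carve the
            # stream back into length-prefixed packets before
            # re-broadcasting them for the local game to pick up.
            # (A real bridge hooks the game's network layer directly
            # and so never re-captures its own output; that wrinkle
            # is ignored in this sketch.)
            while len(stream) >= 2:
                (size,) = struct.unpack("!H", stream[:2])
                if len(stream) < 2 + size:
                    break
                packet, stream = stream[2:2 + size], stream[2 + size:]
                lan.sendto(packet, ("255.255.255.255", LAN_PORT))

# One player runs run_bridge(None, listen=True); the other runs
# run_bridge("first.players.ip.address") -- after which each copy of
# the game believes the other machine is sitting on its own LAN.
```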

On August 22, 1994, Cotton received an email from a fellow out of the University of Illinois — yes, PLATO’s old home — whom he’d met and played in this way (and beaten, he was always careful to add). His name was Scott Coleman. “I have some ideas for hacking TCPSetup to make it a little easier. Care to do some testing later?” Coleman wrote. “I’ve already emailed Jake [Page] on this, but he hasn’t responded (might be on vacation or something). If he approves, I’m hoping some of these ideas might make it into the next release of TCPSetup. In the meantime, I want to do some experimenting to see what’s feasible.”

Jake Page never did respond to their queries, so Cotton and Coleman just kept beavering away on their own, eventually rewriting TCPSetup entirely to create iDOOM, a more reliable and far less fiddly implementation of the same concept, with support for three- or four-player deathmatches instead of just one-on-one duels. It took off like a rocket; the pair were bombarded with feature requests, most notably to make iDOOM work with other IPX-only games as well. In January of 1995, they added support for Heretic, one of the most popular of the first wave of so-called “DOOM clones.” They changed their program’s name to “iFrag” to reflect the fact that it was now about more than just DOOM.

Having come this far, Cotton and Coleman soon made the conceptual leap that would transform their software from a useful tool to a way of life for a time for many, many thousands of gamers. Why not add support for more games, they asked themselves, not in a bespoke way as they had been doing to date, but in a more sustainable one, by turning their program into a general-purpose IPX-to-TCP/IP bridge, suitable for use with the dozens of other multiplayer games out there that supported only local-area networks out of the box. And why not make their tool into a community while they were at it, by adding an integrated chat service? In addition to its other functions, the program could offer a list of “servers” hosting games, which you could join at the click of a button; no more trolling for opponents elsewhere on the Internet, then laboriously exchanging IP addresses and meeting times and hoping the other guy followed through. This would be instant-gratification online gaming. It would also provide a foretaste at least of persistent online multiplayer gaming; as people won matches, they would become known commodities in the community, setting up a meta-game, a sporting culture of heroes and zeroes where folks kept track of win-loss records and where everybody clamored to hear the results when two big wheels faced off against one another.
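What might the community half of that idea look like in code? Here is a toy sketch, again purely illustrative — Kali’s real protocol, ports, and commands were nothing like this invented HOST/LIST/SAY scheme — of a central lobby offering exactly those two services, a chat channel and a browsable list of game servers:

```python
import socketserver
import threading

games = {}        # advertised game name -> host player's IP address
clients = set()   # everyone currently connected, for chat broadcasts
lock = threading.Lock()

class LobbyHandler(socketserver.StreamRequestHandler):
    def handle(self):
        with lock:
            clients.add(self)
        try:
            for raw in self.rfile:  # one text command per line
                line = raw.decode("utf-8", "replace").strip()
                if line.startswith("HOST "):
                    # Advertise a game session under the sender's address.
                    with lock:
                        games[line[5:]] = self.client_address[0]
                elif line == "LIST":
                    # Send back the current server list.
                    with lock:
                        rows = [f"{name} @ {ip}" for name, ip in games.items()]
                    self.wfile.write(("\n".join(rows) + "\n").encode())
                elif line.startswith("SAY "):
                    # The integrated chat: relay to everyone connected.
                    message = (line[4:] + "\n").encode()
                    with lock:
                        for client in clients:
                            try:
                                client.wfile.write(message)
                            except OSError:
                                pass  # a client that vanished mid-chat
        finally:
            with lock:
                clients.discard(self)

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("", 5500), LobbyHandler) as lobby:
        lobby.serve_forever()
```

A real client would, upon picking an entry from that list, hand the host’s address over to something like the bridge sketched above — and it was precisely that one-two punch of meeting place plus connection that made the software feel so much like magic.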

Cotton and Coleman renamed their software for the third time in less than nine months, calling it Kali, a name suggested by Coleman’s Indian-American girlfriend (later his wife). “The Kali avatar is usually depicted with swords in her hands and a necklace of skulls from those she has killed,” says Coleman, “which seemed appropriate for a deathmatch game.” Largely at the behest of Cotton, always the more commercially-minded of the pair, they decided to make Kali shareware, just like DOOM itself: multiplayer sessions would be limited to fifteen minutes at a time until you coughed up a $20 registration fee. Cotton went through the logistics of setting up and running a business in Georgia while Coleman did most of the coding in Illinois. (Rather astonishingly, Cotton and Coleman had still never met one another face to face in 2013, when gaming historian David L. Craddock conducted an interview with them that has been an invaluable source of quotes and information for this article.)

Kali certainly wasn’t the only solution in this space; a commercial service called DWANGO had existed since December of 1994, with the direct backing of John Carmack and John Romero, whose company id Software collected 20 percent of its revenue in return for the endorsement. But DWANGO ran over old-fashioned direct-dial-up connections rather than the Internet, meaning you had to pay long-distance charges to use it if you weren’t lucky enough to live close to one of its host computers. On top of that, it charged $9 for just five hours of access per month, with the fees escalating from there. Kali, by contrast, was available to you forever for as many hours per month as you liked after you plunked down your one-time fee of $20.

So, Kali was popular right from its first release on April 26, 1995. Yet it was still an awkward piece of software for the casual user despite the duo’s best efforts, being tied to MS-DOS, whose support for TCP/IP relied on a creaky edifice of third-party tools. The arrival of Windows 95 was a godsend for Kali, as it was for computer gaming in general, making the hobby accessible in a way it had never been before. The so-called “Kali95” was available by early 1996, and things exploded from there. Kali struck countless gamers with all the force of a revelation; who would have dreamed that it could be so easy to play against another human online? Lloyd Case, for example, wrote in Computer Gaming World magazine that using Kali for the first time was “one of the most profound gaming experiences I’ve had in a long time.” Reminiscing seventeen years later, David L. Craddock described how “using Kali for the first time was like magic. Jumping into a game and playing with other people. It blew my fourteen-year-old mind.” In late 1996, the number of registered Kali users ticked past 50,000, even as quite possibly just as many or more were playing with cracked versions that bypassed the simplistic serial-number-registration process. First-person-shooter deathmatches abounded, but you could also play real-time strategies like Command & Conquer and Warcraft, or even the Links golf simulation. Computer Gaming World gave Kali a special year-end award for “Online-Enabling Technology.”

Kali for Windows 95.

Competitors were rushing in at a breakneck pace by this time, some of them far more conventionally “professional” than Kali, whose origin story was, as we’ve seen, as underground and organic as that of DOOM itself. The most prominent of the venture-capital-funded startups were MPlayer (co-founded by Brian Moriarty of Infocom and LucasArts fame, and employing Dani Bunten Berry as a consultant during the last months of her life) and the Total Entertainment Network, better known as simply TEN. In contrast to Kali’s one-time fee, they, like DWANGO before them, relied on subscription billing: $20 per month for MPlayer, $15 per month for TEN. Despite slick advertising and countless other advantages that Kali lacked, neither would ever come close to overtaking its scruffy older rival, which had price as well as oodles of grass-roots goodwill on its side. Jay Cotton:

It was always my belief that Kali would continue to be successful as long as I never got greedy. I wanted everyone to be so happy with their purchase that they would never hesitate to recommend it to a friend. [I would] never charge more than someone would be readily willing to pay. It also became a selling point that Kali only charged a one-time fee, with free upgrades forever. People really liked this, and it prevented newcomers (TEN, Heat [a service launched in 1997 by Sega of America], MPlayer, etc.) from being able to charge enough to pay for their expensive overheads.

Kali was able to compete with TEN, MPlayer, and Heat because it already had a large established user base (more users equals more fun) and because it was much, much cheaper. These new services wanted to charge a subscription fee, but didn’t provide enough added benefit to justify the added expense.

It was a heady rush indeed, although it would also prove a short-lived one; Kali’s competitors would all be out of business within a year or so of the turn of the millennium. Kali itself stuck around after that, but as a shadow of what it had been, strictly a place for old-timers to reminisce and play the old hits. “I keep it running just out of habit,” said Jay Cotton in 2013. “I make just enough money on website ads to pay for the server.” It still exists today, presumably as a result of the same force of habit.

One half of what Kali and its peers offered was all too obviously ephemeral from the start: as the Internet went mainstream, developers inevitably began building TCP/IP support right into their games, eliminating the need for an external IPX-to-TCP/IP bridge. (For example, Quake, id Software’s much-anticipated follow-up to DOOM, did just this when it finally arrived in 1996.) But the other half of what they offered was community, which may have seemed a more durable sort of benefit. As it happened, though, one clever studio did an end-run around them here as well.
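Before we get to that studio, it’s worth pausing on what the IPX-to-TCP/IP bridge actually entailed, because the trick is easy to visualize in code: capture each IPX packet the game broadcasts on the local network, wrap it unchanged inside a UDP datagram, send it over the Internet to every other player in the session, then unwrap it on the far side and hand it back to the game as if it had arrived as local IPX traffic. The sketch below is purely conceptual; the real Kali hooked the MS-DOS IPX driver directly and had its own protocol, so the port number, peer list, and framing here are invented for illustration.

```python
import socket

# Conceptual sketch of an IPX-over-UDP bridge in the spirit of Kali.
# The port number and peer list are invented for illustration.
TUNNEL_PORT = 2213
peers = [("203.0.113.7", TUNNEL_PORT)]  # Internet addresses of fellow players

tunnel = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tunnel.bind(("", TUNNEL_PORT))

def forward_ipx_frame(ipx_frame: bytes) -> None:
    # A real bridge would capture this frame by hooking the IPX driver;
    # the encapsulation itself is trivial: the frame rides inside UDP as-is.
    for peer in peers:
        tunnel.sendto(ipx_frame, peer)

def receive_ipx_frame() -> bytes:
    # Unwrap an incoming datagram and hand the payload back to the game
    # as though it had appeared on the local IPX network.
    payload, _sender = tunnel.recvfrom(1500)  # 1500 = typical Ethernet MTU
    return payload
```

Once games began speaking TCP/IP natively, this entire layer of indirection, and the software that provided it, became superfluous.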



The folks at Blizzard Entertainment, the small studio and publisher that was fast coming to rival id Software for the title of the hottest name in gaming, were enthusiastic supporters of Kali in the beginning, to the point of hand-tweaking Warcraft II, their mega-hit real-time strategy, to run optimally over the service. They were rewarded by seeing it surpass even DOOM to become the most popular game there of all. But as they were polishing their new action-CRPG Diablo for release in 1996, Mike O’Brien, a Blizzard programmer, suggested that they launch their own service that would do everything Kali did in terms of community, albeit for Blizzard’s games alone. And then he additionally suggested that they make it free, gambling that knowledge of its existence would sell enough games for them at retail to offset its maintenance costs. Blizzard’s unofficial motto had long been “Let’s be awesome,” reflecting their determination to sell exactly the games that real hardcore gamers were craving, honed to a perfect finish, and to always give them that little bit extra. What better way to be awesome than by letting their customers effortlessly play and socialize online, and to do so for free?

The idea was given an extra dollop of urgency by the fact that Westwood Studios, the maker of Warcraft’s chief competitor Command & Conquer, had introduced a service called Westwood Chat that could launch people directly into a licensed version of Monopoly. (Shades of Dani Bunten Berry’s cherished childhood memories…) At the moment it supported only Monopoly, a title that appealed to a very different demographic from the hardcore crowd who favored Blizzard’s games, but who knew how long that would last?[4]

So, when Diablo shipped in the last week of 1996, it included something called Battle.net, a one-click chat, matchmaking, and multiplayer service. Battle.net made everything easier than it had ever been before. It would even automatically patch your copy of the game to the latest version when you logged on, pioneering the “software as a service” model in gaming that has become everyday life in our current age of Steam. “It was so natural,” says Blizzard executive Max Schaefer. “You didn’t think about the fact that you were playing with a dude in Korea and a guy in Israel. It’s really a remarkable thing when you think about it. How often are people casually matched up in different parts of the world?” The answer to that question, of course, was “not very often” in the context of 1997. Today, it’s as normal as computers themselves, thanks to groundbreaking initiatives like this one. Blizzard programmer Jeff Strain:

We believed that in order for it [Battle.net] to really be embraced and adopted, that accessibility had to be there. The real catch for Battle.net was that it was inside-out rather than outside-in. You jumped right into the game. You connected players from within the game experience. You did not alt-tab off into a Web browser to set up your games and have the Web browser try to pass off information or something like that. It was a service designed from Day One to be built into actual games.
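Strain’s “inside-out” principle, combined with the automatic patching described above, implies a login flow along the following lines. To be clear, this is a hypothetical sketch: Blizzard’s actual protocol was proprietary, and every name, URL, and helper function here is invented.

```python
import urllib.request
from pathlib import Path

CLIENT_VERSION = "1.00"                    # hypothetical version string
PATCH_HOST = "http://patches.example.net"  # placeholder, not a real host

def apply_patch(patch_file: str) -> None:
    # Stand-in for the real patcher, which would rewrite the game's files.
    print(f"applying {patch_file} and relaunching")

def enter_lobby() -> None:
    # Stand-in for joining chat and matchmaking inside the game itself.
    print("entering the Battle.net lobby")

def connect(server_version: str) -> None:
    # The essential idea: the version check happens at login, inside the
    # game, so players stay current without ever hunting down patches.
    if CLIENT_VERSION != server_version:
        url = f"{PATCH_HOST}/patch-{server_version}.bin"
        Path("update.bin").write_bytes(urllib.request.urlopen(url).read())
        apply_patch("update.bin")
    enter_lobby()
```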

The combination of Diablo and Battle.net brought a new, more palpable sort of persistence to online gaming. Players of DOOM or Warcraft II might become known as hotshots on services like Kali, but their reputation conferred no tangible benefit once they entered a game session. A DOOM deathmatch or a Warcraft II battle was a one-and-done event, which everyone started on an equal footing, which everyone would exit again within an hour or so, with nothing but memories and perhaps bragging rights to show for what had transpired.

Diablo, however, was different. Although less narratively and systemically ambitious than many of its recent brethren, it was nevertheless a CRPG, a genre all about building up a character over many gaming sessions. Multiplayer Diablo retained this aspect: the first time you went online, you had to pick one of the three pre-made first-level characters to play, but after that you could keep bringing the same character back to session after session, with all of the skills and loot she had already collected. Suddenly the link between the real people in the chat rooms and their avatars that lived in the game proper was much more concrete. Many found it incredibly compelling. People started to assume the roles of their characters even when they were just hanging out in the chat rooms, started in some very real sense to live the game.

But it wasn’t all sunshine and roses. Battle.net became a breeding ground of the toxic behaviors that have continued to dog online gaming to this day, a social laboratory demonstrating what happens when you take a bunch of hyper-competitive, rambunctious young men and give them carte blanche to have at it any way they wish with virtual swords and spells. The service was soon awash with “griefers,” players who would join others on their adventures, ostensibly as their allies in the dungeon, then literally stab them in the back when they least expected it, killing their characters and running off with all of their hard-won loot. The experience could be downright traumatizing for the victims, who had thought they were joining up with friendly strangers simply to have fun together in a cool new game. “Going online and getting killed was so scarring,” acknowledges David Brevik, Diablo‘s original creator. “Those players are still feeling a little bit apprehensive.”

To make matters worse, many of the griefers were also cheaters. Diablo had been born and bred a single-player game; multiplayer had been a very late addition. This had major ramifications. Diablo stored all the information about the character you played online on your local hard drive rather than the Battle.net server. Learn how to modify this file, and you could create a veritable god for yourself in about ten minutes, instead of the dozens of hours it would take playing the honest way. “Trainers” — programs that could automatically do the necessary hacking for you — spread like wildfire across the Internet. Other folks learned to hack the game’s executable files themselves. Most infamously, they figured out ways to attack other players while they were still in the game’s above-ground town, supposedly a safe space reserved for shopping and healing. Battle.net as a whole took on a siege mentality, as people who wanted to play honorably and honestly learned to lock the masses out with passwords that they exchanged only with trusted friends. This worked after a fashion, but it was also a betrayal of the core premise and advantage of Battle.net, the ability to find a quick pick-up game anytime you wanted one. Yet there was nothing Blizzard could do about it without rewriting the whole game from the ground up. They would eventually do this — but they would call the end result Diablo II. In the meanwhile, it was a case of player beware.
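To appreciate why the cheating was so hard to stamp out, consider what trusting the client with its own character file means in practice. The snippet below is deliberately hypothetical (the offsets and field layout are invented, not Diablo’s real save format), but a trainer needed to do little more than this to a file sitting on the player’s own disk:

```python
import struct

# Invented layout for illustration only; Diablo's actual character file
# was organized quite differently.
LEVEL_OFFSET = 0x10  # assumed location of a one-byte character level
GOLD_OFFSET = 0x40   # assumed location of a 32-bit little-endian gold count

def make_demigod(path: str) -> None:
    # Open the character file in place and overwrite two fields. No server
    # ever sees or verifies these values, so the edit simply sticks.
    with open(path, "r+b") as f:
        f.seek(LEVEL_OFFSET)
        f.write(struct.pack("<B", 50))             # maximum level
        f.seek(GOLD_OFFSET)
        f.write(struct.pack("<I", 2_000_000_000))  # near-limitless gold
```

When the rewrite finally came, Diablo II’s “closed” Battle.net realms kept the authoritative character data on the server, beyond the reach of any trainer.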

It’s important to understand that, for all that it resembled what would come later all too much from a sociological perspective, multiplayer Diablo was still no more persistent than Moria and Oubliette had been on the old PLATO network: each player’s character was retained from session to session, but nothing about the state of the world. Each world, or instance of the game, could contain a maximum of four human players, and disappeared as soon as the last player left it, leaving as its legacy only the experience points and items its inhabitants had collected from it while it existed. Players could and did kill the demon Diablo, the sole goal of the single-player game, one that usually required ten hours or more of questing to achieve, over and over again in the online version. In this sense, multiplayer Diablo was a completely different game from single-player Diablo, replacing the simple quest narrative of the latter with a social meta-game of character-building and player-versus-player combat.

For lots and lots of people, this was lots and lots of fun; Diablo was hugely popular despite all of the exploits it permitted — indeed, for some players perchance, because of them. It became one of the biggest computer games of the 1990s, bringing online gaming to the masses in a way that even Kali had never managed. Yet there was still a ways to go to reach total persistence, to bring a permanent virtual world to life. Next time, then, we’ll see how mainstream commercial games of the 1990s sought to achieve a degree of persistence that the first MUD could boast of already in 1979. These latest virtual worlds, however, would attempt to do so with all the bells and whistles and audiovisual niceties that a new generation of gamers raised on multimedia and 3D graphics demanded. An old dog in the CRPG space was about to learn a new trick, creating in the process a new gaming acronym that’s even more of a mouthful than POMG.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Stay Awhile and Listen Volumes 1 and 2 by David L. Craddock, Masters of Doom by David Kushner, and The Friendly Orange Glow by Brian Dear; Retro Gamer 43, 90, and 103; Computer Gaming World of September 1996 and May 1997; Next Generation of March 1997. Online sources include “The Story of Battle.net” by Wes Fenlon at PC Gamer, Dan Griliopoulos’s collection of interviews about Command & Conquer, Brian Moriarty’s speech honoring Dani Bunten Berry from the 1998 Game Developers Conference, and Jay Cotton’s history of Kali on the DOOM II fan site. Plus some posts on The CRPG Addict, to which I’ve linked in the article proper.

Footnotes
1 The PLATO Moria was a completely different game from the 1983 single-player roguelike that bore the same name.
2 Not the same game as the 2002 Bioware CRPG of the same name.
3 Wheeler Dealers and all of her other games that are mentioned in this article were credited to Dan Bunten, the name under which she lived until 1992.
4 Westwood Chat would indeed evolve eventually into Westwood Online, with full support for Command & Conquer, but that would happen only after Blizzard had rolled out their own service.

Putting the “J” in the RPG, Part 1: Dorakue!


Fair warning: this article includes some plot spoilers of Final Fantasy I through VI.

The videogame industry has always run on hype, but the amount of it that surrounded Final Fantasy VII in 1997 was unparalleled in its time. This new game for the Sony PlayStation console was simply inescapable. The American marketing teams of Sony and Square Corporation, the game’s Japanese developer and publisher, had been given $30 million with which to elevate Final Fantasy VII to the same status as the Super Marios of the world. They plastered Cloud, Aerith, Tifa, Sephiroth, and the game’s other soon-to-be-iconic characters onto urban billboards, onto the sides of buses, and into the pages of glossy magazines like Rolling Stone, Playboy, and Spin. Commercials for the game aired round the clock on MTV, during NFL games and Saturday Night Live, even on giant cinema screens in lieu of more traditional coming-attractions trailers. “They said it couldn’t be done in a major motion picture,” the stentorian announcer intoned. “They were right!” Even if you didn’t care a whit about videogames, you couldn’t avoid knowing that something pretty big was going down in that space.

And if you did care… oh, boy. The staffs of the videogame magazines, hardly known for their sober-mindedness in normal times, worked themselves up to positively orgasmic heights under Square’s not-so-gentle prodding. GameFan told its readers that Final Fantasy VII would be “unquestionably the greatest entertainment product ever created.”

The game is ridiculously beautiful. Analyze five minutes of gameplay in Final Fantasy VII and witness more artistic prowess than most entire games have. The level of detail is absolutely astounding. These graphics are impossible to describe; no words are great enough. Both map and battle graphics are rendered to a level of detail completely unprecedented in the videogame world. Before Final Fantasy VII, I couldn’t have imagined a game looking like this for many years, and that’s no exaggeration. One look at a cut scene or call spell should handily convince you. Final Fantasy VII looks so consistently great that you’ll quickly become numb to the power. Only upon playing another game will you once again realize just how fantastic it is.

But graphics weren’t all that the game had going for it. In fact, they weren’t even the aspect that would come to most indelibly define it for most of its players. No… that thing was, for the very first time in a mainstream console-based videogame with serious aspirations of becoming the toppermost of the poppermost, the story.

I don’t have any room to go into the details, but rest assured that Final Fantasy VII possesses the deepest, most involved story line ever in an RPG. There’s few games that have literally caused my jaw to drop at plot revelations, and I’m most pleased to say that Final Fantasy VII doles out these shocking, unguessable twists with regularity. You are constantly motivated to solve the latest mystery.

So, the hype rolled downhill, from Square at the top to the mass media, then on to the hardcore gamer magazines to ordinary owners of PlayStations. You would have to have been an iconoclastic PlayStation owner indeed not to be shivering with anticipation as the weeks counted down toward the game’s September 7 release. (Owners of other consoles could eat their hearts out; Final Fantasy VII was a PlayStation exclusive.)

Just last year, a member of an Internet gaming forum still fondly recalled how

the lead-up for the US launch of this game was absolutely insane, and, speaking personally, it is the most excited about a game I think I had ever been in my life, and nothing has come close since then. I was only fifteen at the time, and this game totally overtook all my thoughts and imagination. I had never even played a Final Fantasy game before, and I didn’t even like RPGs, yet I would spend hours reading and rereading all the articles from all the gaming magazines I had, inspecting all the screenshots and being absolutely blown away at the visual fidelity I was witnessing. I spent multiple days/hours with my Sony Discman listening to music and drawing the same artwork that was in all the mags. It was literally a genre- and generation-defining game.

Those who preferred to do their gaming on personal computers rather than consoles might be excused for scoffing at all these breathless commentators who seemed to presume that Final Fantasy VII was doing something that had never been done before. If you spent your days playing Quake, Final Fantasy VII‘s battle graphics probably weren’t going to impress you overmuch; if you knew, say, Toonstruck, even the cut scenes might strike you as pretty crude. And then, too, computer-based adventure games and RPGs had been delivering well-developed long-form interactive narratives for many years by 1997, most recently with a decidedly cinematic bent more often than not, with voice actors in place of Final Fantasy VII‘s endless text boxes. Wasn’t Final Fantasy VII just a case of console gamers belatedly catching on to something computer gamers had known all along, and being forced to do so in a technically inferior fashion at that?

Well, yes and no. It’s abundantly true that much of what struck so many as so revelatory about Final Fantasy VII really wasn’t anywhere near as novel as they thought it was. At the same time, though, the aesthetic and design philosophies which it applied to the abstract idea of the RPG truly were dramatically different from the set of approaches favored by Western studios. They were so different, in fact, that the RPG genre in general would be forever bifurcated in gamers’ minds going forward, as the notion of the “JRPG” — the Japanese RPG — entered the gaming lexicon. In time, the label would be applied to games that didn’t actually come from Japan at all, but that evinced the set of styles and approaches so irrevocably cemented in the Western consciousness under the label of “Japanese” by Final Fantasy VII.

We might draw a parallel with what happened in music in the 1960s. The Beatles, the Rolling Stones, and all the other Limey bands who mounted the so-called “British Invasion” of their former Colonies in 1964 had all spent their adolescence steeped in American rock and roll. They took those influences, applied their own British twist to them, then sold them back to American teenagers, who screamed and fainted in the concert halls like Final Fantasy VII fans later would in the pages of the gaming magazines, convinced that the rapture they were feeling was brought on by something genuinely new under the sun — which in the aggregate it was, of course. It took the Japanese to teach Americans how thrilling and accessible — even how emotionally moving — the gaming genre they had invented could truly be.



The roots of the JRPG can be traced back not just to the United States but to a very specific place and time there: to the American Midwest in the early 1970s, where and when Gary Gygax and Dave Arneson, a pair of stolid grognards who would have been utterly nonplussed by the emotional histrionics of a Final Fantasy VII, created a “single-unit wargame” called Dungeons & Dragons. I wrote quite some years ago on this site that their game’s “impact on the culture at large has been, for better or for worse, greater than that of any single novel, film, or piece of music to appear during its lifetime.” I almost want to dismiss those words now as the naïve hyperbole of a younger self. But the thing is, I can’t; I have no choice but to stand by them. Dungeons & Dragons really was that earthshaking, not only in the obvious ways — it’s hard to imagine the post-millennial craze for fantasy in mass media, from the Lord of the Rings films to Game of Thrones, ever taking hold without it — but also in subtler yet ultimately more important ones, in the way it changed the role we play in our entertainments from that of passive spectators to active co-creators, making interactivity the watchword of an entire age of media.

The early popularity of Dungeons & Dragons coincided with the rise of accessible computing, and this proved a potent combination. Fans of the game with access to PLATO, a groundbreaking online community rooted in American universities, moved it as best they could onto computers, yielding the world’s first recognizable CRPGs. Then a couple of PLATO users named Robert Woodhead and Andrew Greenberg made a game of this type for the Apple II personal computer in 1981, calling it Wizardry. Meanwhile Richard Garriott was making Ultima, a different take on the same broad concept of “Dungeons & Dragons on a personal computer.”

By the time Final Fantasy VII stormed the gates of the American market so triumphantly in 1997, the cultures of gaming in the United States and Japan had diverged so markedly that one could almost believe they had never had much of anything to do with one another. Yet in these earliest days of digital gaming — long before the likes of the Nintendo Entertainment System, when Japanese games meant only coin-op arcade hits like Space Invaders, Pac-Man, and Donkey Kong in the minds of most Americans — there was in fact considerable cross-pollination. For Japan was the second place in the world after North America where reasonably usable, pre-assembled, consumer-grade personal computers could be readily purchased; the Japanese Sharp MZ80K and Hitachi MB-6880 trailed the American Trinity of 1977 — the Radio Shack TRS-80, Apple II, and Commodore PET — by less than a year. If these two formative cultures of computing didn’t talk to one another, whom else could they talk to?

Thus pioneering American games publishers like Sierra On-Line and Brøderbund forged links with counterparts in Japan. A Japanese company known as Starcraft became the world’s first gaming localizer, specializing in porting American games to Japanese computers and translating their text into Japanese for the domestic market. As late as the summer of 1985, Roe R. Adams III could write in Computer Gaming World that Sierra’s sprawling twelve-disk-side adventure game Time Zone, long since written off at home as a misbegotten white elephant, “is still high on the charts after three years” in Japan. Brøderbund’s platformer Lode Runner was even bigger, having swum like a salmon upstream in Japan, being ported from home computers to coin-op arcade machines rather than the usual reverse. It had even spawned the world’s first e-sports league, whose matches were shown on Japanese television.

At that time, the first Wizardry game and the second and third Ultima had only recently been translated and released in Japan. And yet if Adams was to be believed,[1] both games already

have huge followings. The computer magazines cover Lord British [Richard Garriott’s nom de plume] like our National Enquirer would cover a television star. When Robert Woodhead of Wizardry fame was recently in Japan, he was practically mobbed by autograph seekers. Just introducing himself in a computer store would start a near-stampede as people would run outside to shout that he was inside.

Robert Woodhead with Japanese Wizardry fans.

The Wizardry and Ultima pump had been primed in Japan by a game called The Black Onyx, created the year before in their image for the Japanese market by an American named Henk Rogers.[2] But his game was quickly eclipsed by the real deals that came directly out of the United States.

Wizardry in particular became a smashing success in Japan, even as a rather lackadaisical attitude toward formal and audiovisual innovation on the part of its masterminds was already condemning it to also-ran status against Ultima and its ilk in the United States. It undoubtedly helped that Wizardry was published in Japan by ASCII Corporation, that country’s nearest equivalent to Microsoft, with heaps of marketing clout and distributional muscle to bring to bear on any challenge. So, while the Wizardry series that American gamers knew petered out in somewhat anticlimactic fashion in the early 1990s after seven games,[3] it spawned close to a dozen Japanese-exclusive titles later in that decade alone, plus many more after the millennium, such that the franchise remains to this day far better known by everyday gamers in Japan than it is in the United States. Robert Woodhead himself spent two years in Japan in the early 1990s working on what would have been a Wizardry MMORPG, if it hadn’t proved to be just too big a mouthful for the hardware and telecommunications infrastructure at his disposal.

Box art helps to demonstrate Wizardry‘s uncanny legacy in Japan. Here we see the original 1981 American release of the first game.

And here we have a Japan-only Wizardry from a decade later, self-consciously echoing a foreboding, austere aesthetic that had become more iconic in Japan than it had ever been in its home country. (American Wizardry boxes from the period look nothing like this, being illustrated in a more conventional, colorful epic-fantasy style.)

Much of the story of such cultural exchanges inevitably becomes a tale of translation. In its original incarnation, the first Wizardry game had had the merest wisp of a plot. In this as in all other respects it was a classic hack-and-slash dungeon crawler: work your way down through ten dungeon levels and kill the evil wizard, finito. What background context there was tended to be tongue-in-cheek, more Piers Anthony than J.R.R. Tolkien; the most desirable sword in the game was called the “Blade of Cuisinart,” for Pete’s sake. Wizardry‘s Japanese translators, however, took it all in with wide-eyed earnestness, missing the winking and nodding entirely. They saw a rather grim, austere milieu a million miles away from the game that Americans knew — a place where a Cuisinart wasn’t a stainless-steel food processor but a portentous ancient warrior clan.

When the Japanese started to make their own Wizardry games, they continued in this direction, to almost hilarious effect if one knew the source material behind their efforts; it rather smacks of the post-apocalyptic monks in A Canticle for Leibowitz making a theology for themselves out of the ephemeral advertising copy of their pre-apocalyptic forebears. A franchise that had in its first several American releases aspired to be about nothing more than killing monsters for loot — and many of them aggressively silly monsters at that — gave birth to audio CDs full of po-faced stories and lore, anime films and manga books, a sprawling line of toys and miniature figures, even a complete tabletop RPG system. But, lest we Westerners begin to feel too smug about all this, know that the same process would eventually come to work in reverse in the JRPG field, with nuanced Japanese writing being flattened out and flat-out misunderstood by clueless American translators.

The history of Wizardry in Japan is fascinating by dint of its sheer unlikeliness, but the game’s importance on the global stage actually stems more from the Japanese games it influenced than from the ones that bore the Wizardry name right there on the box. For Wizardry, along with the early Ultima games, happened to catch the attention of Koichi Nakamura and Yuji Horii, a software-development duo who had already made several games together for a Japanese publisher called Enix. “Horii-san was really into Ultima, and I was really into Wizardry,” remembers Nakamura. This made sense. Nakamura was the programmer of the pair, naturally attracted to Wizardry‘s emphasis on tactics and systems. Horii, on the other hand, was the storytelling type, who wrote for manga magazines in addition to games, and was thus drawn to Ultima‘s quirkier, more sprawling world and its spirit of open-ended exploration. The pair decided to make their own RPG for the Japanese market, combining what they each saw as the best parts of Wizardry and Ultima.

Yuji Horii in the 1980s. Little known outside his home country, he is a celebrity inside its borders. In his book on Japanese videogame culture, Chris Kohler calls him a Steven Spielberg-like figure there, in terms both of name recognition and the style of entertainment he represents.

This was interesting, but not revolutionary in itself; you’ll remember that Henk Rogers had already done essentially the same thing in Japan with The Black Onyx before Wizardry and Ultima ever officially arrived there. Nevertheless, the choices Nakamura and Horii made as they set about their task give them a better claim to the title of revolutionaries on this front than Rogers enjoys. They decided that making a game that combined the best of Wizardry and Ultima really did mean just that: it did not mean, that is to say, throwing together every feature of each which they could pack in and calling it a day, as many a Western developer might have. They decided to make a game that was simpler than either of its inspirations, much less the two of them together.

Their reasons for doing so were artistic, commercial, and technical. In the realm of the first, Horii in particular just didn’t like overly complicated games; he was the kind of player who would prefer never to have to glance at a manual, whose ideal game intuitively communicated to you everything you needed to know in order to play it. In the realm of the second, the pair was sure that the average Japanese person, like the average person in most countries, felt the same as Horii; even in the United States, Ultima and Wizardry were niche products, and Nakamura and Horii had mass-market ambitions. And in the realm of the third, they were sharply limited in how much they could put into their RPG anyway, because they intended it for the Nintendo Famicom console, where their entire game — code, data, graphics, and sound — would have to fit onto a 64 K cartridge in lieu of floppy disks and would have to be steerable using an eight-button controller in lieu of a keyboard. Luckily, Nakamura and Horii already had experience with just this sort of simplification. Their most recent output had been inspired by the adventure games of American companies like Sierra and Infocom, but had replaced those games’ text parsers with controller-friendly multiple-choice menus.

In deciding to put American RPGs through the same wringer, they established one of the core attributes of the JRPG sub-genre: generally speaking, these games were and would remain simpler than their Western counterparts, which sometimes seemed to positively revel in their complexity as a badge of honor. Another attribute emerged fully-formed from the writerly heart of Yuji Horii. He crafted an unusually rich, largely linear plot for the game. Rather than being a disadvantage, he thought linearity would make this new style of console game “more accessible to consumers”: “We really focused on ensuring people would be able to experience the fun of the story.”

He called upon his friends at the manga magazines to help him illustrate his tale with large, colorful figures in that distinctly Japanese style that has become so immediately recognizable all over the world. At this stage, it was perhaps more prevalent on the box than in the game itself, the Famicom’s graphical fidelity being what it was. Nonetheless, another precedent that has held true in JRPGs right down to the present day was set by the overall visual aesthetic of this, the canonical first example of the breed. Ditto its audio aesthetic, which took the form of a memorable, melodic, eminently hummable chip-tune soundtrack. “From the very beginning, we wanted to create a warm, inviting world,” says Horii.

Dragon Quest. Ultima veterans will almost expect to meet Lord British on his throne somewhere. With its overhead view and its large over-world full of towns to be visited, Dragon Quest owed even more to Ultima than it did to Wizardry — unsurprisingly so, given that the former was the American RPG which its chief creative architect Yuji Horii preferred.

Dragon Quest was released on May 27, 1986. Console gamers — not only those in Japan, but anywhere on the globe — had never seen anything like it. Playing this game to the end was a long-form endeavor that could stretch out over weeks or months; you wrote down an alphanumeric code it provided to you on exit, then entered this code when you returned to the game in order to jump back to wherever you had left off.
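That password, known to Japanese players as the fukkatsu no jumon, or “spell of resurrection,” was a neat piece of engineering in its own right: the game serialized your level, gold, and progress flags into a compact bit string, appended a checksum to catch mistyped codes, and spelled the result out in a small alphabet of symbols. Here is a toy reconstruction of the idea; the field widths, alphabet, and checksum below are invented, and the real game used Japanese kana rather than Roman letters.

```python
# Toy password-save scheme in the spirit of Dragon Quest's jumon.
ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"  # 32 symbols = 5 bits each

def encode_state(level: int, gold: int, flags: int) -> str:
    # Pack the state into 32 bits, append an 8-bit checksum, then emit
    # eight 5-bit symbols (40 bits in all).
    state = (level & 0xFF) | ((gold & 0xFFFF) << 8) | ((flags & 0xFF) << 24)
    full = (state << 8) | (sum(state.to_bytes(4, "little")) & 0xFF)
    code = ""
    for _ in range(8):
        code = ALPHABET[full & 0x1F] + code
        full >>= 5
    return code

def decode_state(code: str) -> tuple[int, int, int]:
    full = 0
    for ch in code:
        full = (full << 5) | ALPHABET.index(ch)
    checksum, state = full & 0xFF, full >> 8
    if checksum != sum(state.to_bytes(4, "little")) & 0xFF:
        raise ValueError("password mistyped")  # the dreaded invalid jumon
    return state & 0xFF, (state >> 8) & 0xFFFF, (state >> 24) & 0xFF
```

Eight symbols suffice here only because this toy tracks almost nothing; the real game’s codes were considerably longer, precisely because every item and quest flag had to fit inside them.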

That said, the fact that the entire game state could be packed into a handful of numbers and letters does serve to illustrate just how simple Dragon Quest really was at bottom. By the standards of only a few years later, much less today, it was pretty boring. Fighting random monsters wasn’t so much a distraction from the rest of the game as the only thing available to do; the grinding was the game. In 2012, critic Nick Simberg wondered at “how willing we were to sit down on the couch and fight the same ten enemies over and over for hours, just building up gold and experience points”; he compared Dragon Quest to “a child’s first crayon drawing, stuck with a magnet to the fridge.”

And yet, as the saying goes, you have to start somewhere. Japanese gamers were amazed and entranced, buying 1 million copies of Dragon Quest in its first six months, over 2 million copies in all. And so a new sub-genre was born, inspired by American games but indelibly Japanese in a way The Black Onyx had not been. Many or most of the people who played and enjoyed Dragon Quest had never even heard of its original wellspring Dungeons & Dragons.

We all know what happens when a game becomes a hit on the scale of Dragon Quest. There were sequels — two within two years of the first game, then three more in the eight years after them, as the demands of higher production values slowed down Enix’s pace a bit. Wizardry was big in Japan, but it was nothing compared to Dragon Quest, which sold 2.4 million copies in its second incarnation, followed by an extraordinary 3.8 million copies in its third. Middle managers and schoolmasters alike learned to dread the release of a new entry in the franchise, as about half the population of Japan under a certain age would invariably call in sick that day. When Enix started bringing out the latest games on non-business days, a widespread urban legend said this had been done in accordance with a decree from the Japanese Diet, which demanded that “henceforth Dragon Quest games are to be released on Sunday or national holidays only”; the urban legend wasn’t true, but the fact that so many people in Japan could so easily believe it says something in itself. Just as the early American game Adventure lent its name to an entire genre that followed it, the Japanese portmanteau word for “Dragon Quest” — Dorakue — became synonymous with the RPG in general there, such that when you told someone you were “playing dorakue” you might really be playing one of the series’s countless imitators.

Giving any remotely complete overview of these dorakue games would require dozens of articles, along with someone to write them who knows far more about them than I do. But one name is inescapable in the field. I refer, of course, to Final Fantasy.


Hironobu Sakaguchi in 1991.

Legend has it that Hironobu Sakaguchi, the father of Final Fantasy, chose that name because he thought that the first entry in the eventual franchise would be the last videogame he ever made. A former professional musician with numerous and diverse interests, Sakaguchi had been working for the Japanese software developer and publisher Square for a few years already by 1987, designing and programming Famicom action games that he himself found rather banal and that weren’t even selling all that well. He felt ready to do something else with his life, was poised to go back to university to try to figure out what that thing ought to be. But before he did so, he wanted to try something completely different at Square.

Another, less dramatic but probably more accurate version of the origin story has it that Sakaguchi simply liked the way the words “final” and “fantasy” sounded together. At any rate, he convinced his managers to give him half a dozen assistants and six months to make a dorakue game.[4]


Final Fantasy I.

The very first Final Fantasy may not have looked all that different from Dragon Quest at first glance — it was still a Famicom game, after all, with all the audiovisual limitations that implies — but it had a story line that was more thematically thorny and logistically twisted than anything Yuji Horii might have come up with. As it began, you found yourself in the midst of a quest to save a princess from an evil knight, which certainly sounded typical enough to anyone who had ever played a dorakue game before. In this case, however, you completed that task within an hour, only to learn that it was just a prologue to the real plot. In his book-length history and study of the aesthetics of Japanese videogames, Chris Kohler detects an implicit message here: “Final Fantasy is about much more than saving the princess. Compared to the adventure that is about to take place, saving a princess is merely child’s play.” In fact, only after the prologue was complete did the opening credits finally roll, thus displaying another consistent quality of Final Fantasy: its love of unabashedly cinematic drama.

Still, for all that it was more narratively ambitious than what had come before, the first Final Fantasy can, like the first Dragon Quest, seem a stunted creation today. Technical limitations meant that you still spent 95 percent of your time just grinding for experience. “Final Fantasy may have helped build the genre, but it didn’t necessarily know exactly how to make it fun,” acknowledges Aidan Moher in his book about JRPGs. And yet when it came to dorakue games in the late 1980s, it seemed that Sakaguchi’s countrymen were happy to reward even the potential for eventual fun. They made Final Fantasy the solid commercial success that had heretofore hovered so frustratingly out of reach of its creator; it sold 400,000 copies. Assured that he would never have to work on a mindless action game again, Sakaguchi agreed to stay on at Square to build upon its template.

Final Fantasy II, which was released exactly one year after the first game in December of 1988 and promptly doubled its sales, added more essential pieces to what would become the franchise’s template. Although labelled and marketed as a sequel, its setting, characters, and plot had no relation to what had come before. Going forward, it would remain a consistent point of pride with Sakaguchi to come up with each new Final Fantasy from whole cloth, even when fans begged him for a reunion with their favorite places and people. In a world as afflicted with sequelitis as ours, he can only be commended for sticking to his guns.

In another sense, though, Final Fantasy II was notable for abandoning a blank slate rather than embracing it. For the first time, its players were given a pre-made party full of pre-made personalities to guide rather than being allowed to roll their own. Although they could rename the characters if they were absolutely determined to do so — this ability would be retained as a sort of vestigial feature as late as Final Fantasy VII — they were otherwise set in stone, the better to serve the needs of the set-piece story Sakaguchi wanted to tell. This approach, which many players of Western RPGs did and still do regard as a betrayal of one of the core promises of the genre, would become commonplace in JRPGs. Few contrasts illustrate so perfectly the growing divide between these two visions of the RPG: the one open-ended and player-driven, sometimes to a fault; the other tightly scripted and story-driven, again sometimes to a fault. In a Western RPG, you write a story for yourself; in a JRPG, you live a story that someone else has already written for you.

Consider, for example, the two lineages’ handling of mortality. If one of your characters dies in battle in a Western RPG, it might be difficult and expensive, or in some cases impossible, to restore her to life; in this case, you either revert to an earlier saved state or you just accept her death as another part of the story you’re writing and move on to the next chapter with an appropriately heavy heart. In a JRPG, on the other hand, death in battle is never final; it’s almost always easy to bring a character who gets beat down to zero hit points back to life. What are truly fatal, however, are pre-scripted deaths, the ones the writers have deemed necessary for storytelling purposes. Final Fantasy II already contained the first of these; years later, Final Fantasy VII would be host to the most famous of them all, a death so shocking that you just have to call it that scene and everyone who has ever played the game will immediately know what you’re talking about. To steal a phrase from Graham Nelson, the narrative always trumps the crossword in JRPGs; they happily override their gameplay mechanics whenever the story they wish to tell demands it, creating an artistic and systemic discontinuity that’s enough to make Aristotle roll over in his grave. Yet a huge global audience of players is not bothered at all by it — not if the story is good enough.

But we’ve gotten somewhat ahead of ourselves; the evolution of the 1980s JRPG toward the modern-day template came in fits and starts rather than as a linear progression. Final Fantasy III, which was released in 1990, actually returned to a player-generated party, and yet the market failed to punish it for its conservatism. Far from it: it sold 1.4 million copies.

Final Fantasy IV, on the other hand, chose to double down on the innovations Final Fantasy II had deployed, and sold in about the same numbers as Final Fantasy III. Released in July of 1991, it provided you with not just a single pre-made party but an array of characters who moved in and out of your control as the needs of the plot dictated, thereby setting yet another longstanding precedent for the series going forward. Ditto the nature of the plot, which leaned into shades of gray as never before. Chris Kohler:

The story deals with mature themes and complex characters. In Final Fantasy II, the squeaky-clean main characters were attacked by purely evil dark knights; here, our main character is a dark knight struggling with his position, paid to kill innocents, trying to reconcile loyalty to his kingdom with his sense of right and wrong. He is involved in a sexual relationship. His final mission for the king turns out to be a mass murder: the “phantom monsters” are really just a town of peaceful humans whose magic the corrupt king has deemed dangerous. (Note the heavy political overtones.)

Among Western RPGs, only the more recent Ultima games had dared to deviate so markedly from the absolute-good-versus-absolute-evil tales of everyday heroic fantasy. (In fact, the plot of Final Fantasy IV bears a lot of similarities to that of Ultima V…)

Ever since Final Fantasy IV, the series has been filled with an inordinate number of moody young James Deans and long-suffering Natalie Woods who love them.

Final Fantasy IV was also notable for introducing an “active-time battle system,” a hybrid between the turn-based systems the series had previously employed and real-time combat, designed to provide some of the excitement of the latter without completely sacrificing the tactical affordances of the former. (In a nutshell, if you spend too long deciding what to do when it’s your turn, the enemies will jump in and take another turn of their own while you dilly-dally.) It too would remain a staple of the franchise for many installments to come.
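The mechanism is easy to sketch: every combatant owns a gauge that fills in real time at a rate set by a speed statistic, a full gauge earns an action, and the clock never stops while the player browses the menus. The toy loop below uses invented numbers; Square’s actual formulas also account for spell charge times, haste and slow effects, and the configurable “active” versus “wait” modes the later games offered.

```python
# Bare-bones active-time battle loop with invented stats.
class Combatant:
    def __init__(self, name: str, speed: int, is_player: bool):
        self.name, self.speed, self.is_player = name, speed, is_player
        self.gauge = 0.0  # fills from 0 to 100, then the combatant acts

def battle_tick(combatants: list[Combatant], dt: float) -> None:
    # Time advances for everyone, menus open or not: the heart of ATB.
    for c in combatants:
        c.gauge += c.speed * dt
        if c.gauge >= 100:
            c.gauge = 0.0
            if c.is_player:
                print(f"{c.name}'s turn: menu opens, but the clock runs on")
            else:
                print(f"{c.name} attacks while you deliberate")

# Example: a slow player character against a quick monster.
party = [Combatant("Cecil", speed=9, is_player=True),
         Combatant("Imp", speed=14, is_player=False)]
for _ in range(100):
    battle_tick(party, dt=0.1)
```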

Final Fantasy V, which was released in December of 1992, was like Final Fantasy III something of a placeholder or even a retrenchment, dialing back on several of the fourth game’s innovations. It sold almost 2.5 million copies.

Both the fourth and fifth games had been made for the Super Famicom, Nintendo’s 16-bit successor to its first console, and sported correspondingly improved production values. But most JRPG fans agree that it was with the sixth game — the last for the Super Famicom — that all the pieces finally came together into a truly frictionless whole. Indeed, a substantial and vocal minority will tell you that Final Fantasy VI rather than its immediate successor is the best Final Fantasy ever, balanced perfectly between where the series had been and where it was going.

Final Fantasy VI abandoned conventional epic-fantasy settings for a steampunk milieu out of Jules Verne. As we’ll see in a later article, Final Fantasy VII‘s setting would deviate even more from the norm. This creative restlessness is one of the series’s best traits, standing it in good stead in comparison to the glut of nearly indistinguishably Tolkienesque Western RPGs of the 1980s and 1990s.

From its ominous opening-credits sequence on, Final Fantasy VI strained for a gravitas that no previous JRPG had approached, and arguably succeeded in achieving it at least intermittently. It played out on a scale that had never been seen before; by the end of the game, more than a dozen separate characters had moved in and out of your party. Chris Kohler identifies the game’s main theme as “love in all its forms — romantic love, parental love, sibling love, and platonic love. Sakaguchi asks the player, what is love and where can we find it?”

Before that scene in Final Fantasy VII, Hironobu Sakaguchi served up a shocker of equal magnitude in Final Fantasy VI. Halfway through the game, the bad guys win despite your best efforts and the world effectively ends, leaving your party wandering through a post-apocalyptic World of Ruin like the characters in a Harlan Ellison story. The effect this had on some players’ emotions could verge on traumatizing — heady stuff for a videogame on a console still best known worldwide as the cuddly home of Super Mario. For many of its young players, Final Fantasy VI was their first close encounter on their own recognizance — i.e., outside of compulsory school assignments — with the sort of literature that attempts to move beyond tropes to truly, thoughtfully engage with the human condition.

It’s easy for an old, reasonably well-read guy like me to mock Final Fantasy VI‘s highfalutin aspirations, given that they’re stuffed into a game that still resolves at the granular level into bobble-headed figures fighting cartoon monsters. And it’s equally easy to scoff at the heavy-handed emotional manipulation that has always been part and parcel of the JRPG; subtle the sub-genre most definitely is not. Nonetheless, meaningful literature is where you find it, and the empathy it engenders can only be welcomed in a world in desperate need of it. Whatever else you can say about Final Fantasy and most of its JRPG cousins, the messages these games convey are generally noble ones, about friendship, loyalty, and the necessity of trying to do the right thing in hard situations, even when it isn’t so easy to even figure out what the right thing is. While these messages are accompanied by plenty of violence in the abstract, it is indeed abstracted — highly stylized and, what with the bifurcation between game and story that is so prevalent in the sub-genre, often oddly divorced from the games’ core themes.

Released in April of 1994, Final Fantasy VI sold 2.6 million copies in Japan. By this point the domestic popularity of the Final Fantasy franchise as a whole was rivaled only by that of Super Mario and Dragon Quest; two of the three biggest gaming franchises in Japan, that is to say, were dorakue games. In the Western world, however, the picture was quite different.

In the United States, the first-generation Nintendo Famicom was known as the Nintendo Entertainment System, the juggernaut of a console that rescued videogames in the eyes of the wider culture from the status of a short-lived fad to that of a long-lived entertainment staple, on par with movies in terms of economics if not cachet. Yet JRPGs weren’t a part of that initial success story. The first example of the breed didn’t even reach American shores until 1989. It was, appropriately enough, the original Dragon Quest, the game that had started it all in Japan; it was renamed Dragon Warrior for the American market, due to a conflict with an old American tabletop RPG by the name of Dragonquest whose trademarks had been acquired by the notoriously litigious TSR of Dungeons & Dragons fame. Enix did make some efforts to modernize the game, such as replacing the password-based saving system with a battery that let you save your state to the cartridge itself. (This same method had been adopted by Final Fantasy and most other post-Dragon Quest JRPGs on the Japanese market as well.) But American console gamers had no real frame of reference for Dragon Warrior, and even the marketing geniuses of Nintendo, which published the game itself in North America, struggled to provide them one. With cartridges piling up in Stateside warehouses, they were reduced to giving away hundreds of thousands of copies of Dragon Warrior to the subscribers of Nintendo Power magazine. For some of these subscribers, the game came as a revelation seven years before Final Fantasy VII; for most, it was an inscrutable curiosity that was quickly tossed aside.

Final Fantasy I, on the other hand, received a more encouraging reception in the United States when it arrived there in 1990: it sold 700,000 copies, 300,000 more than it had managed in Japan. Nevertheless, with the 8-bit Nintendo console reaching the end of its lifespan, Square didn’t bother to export the next two games in the series. It did export Final Fantasy IV for the Super Famicom — or rather the Super Nintendo Entertainment System, as it was known in the West. The results were disappointing in light of the previous game’s reception, so much so that Square didn’t export Final Fantasy V.[5] This habit of skipping over parts of the series led to a confusing state of affairs whereby the American Final Fantasy II was the Japanese Final Fantasy IV and the American Final Fantasy III was the Japanese Final Fantasy VI. The latter game shifted barely one-fourth as many copies in the three-times larger American marketplace as it had in Japan — not disastrous numbers, but still less than the first Final Fantasy had managed.

The heart of the problem was translation, in both the literal sense of the words on the screen and a broader cultural sense. Believing with some justification that the early American consoles from Atari and others had been undone by a glut of substandard product, Nintendo had long made a science out of the polishing of gameplay, demanding that every prospective release survive an unrelenting testing gauntlet before it was granted the “Nintendo Seal of Quality” and approved for sale. But the company had no experience or expertise in polishing text to a similar degree. In most cases, this didn’t matter; most Nintendo games contained very little text anyway. But RPGs were the exception. The increasingly intricate story lines which JRPGs were embracing by the early 1990s demanded good translations by native speakers. What many of them actually got was something very different, leaving even those American gamers who wanted to fall in love baffled by the Japanese-English-dictionary-derived word salads they saw before them. And then, too, many of the games’ cultural concerns and references were distinctly Japanese, such that even a perfect translation might have left Americans confused. It was, one might say, the Blade of Cuisinart problem in reverse.

To be sure, there were Americans who found all of the barriers to entry into these deeply foreign worlds to be more bracing than intimidating, who took on the challenge of meeting the games on their own terms, often emerging with a lifelong passion for all things Japanese. At this stage, though, they were the distinct minority. In Japan and the United States alike, the conventional wisdom through the mid-1990s was that JRPGs didn’t and couldn’t sell well overseas; this was regarded as a fact of life as fundamental as the vagaries of climate. (Thanks to this belief, none of the mainline Final Fantasy games to date had been released in Europe at all.) It would take Final Fantasy VII and a dramatic, controversial switch of platforms on the part of Square to change that. But once those things happened… look out. The JRPG would conquer the world yet.


Where to Get It: Remastered and newly translated versions of the Japanese Final Fantasy I, II, III, IV, V, and VI are available on Steam. The Dragon Quest series has been converted to iOS and Android apps, just a search away on the Apple and Google stores.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Pure Invention: How Japan Made the Modern World by Matt Alt, Power-Up: How Japanese Video Games Gave the World an Extra Life by Chris Kohler, Fight, Magic, Items: The History of Final Fantasy, Dragon Quest, and the Rise of Japanese RPGs in the West by Aidan Moher, and Atari to Zelda: Japan’s Videogames in Global Contexts by Mia Consalvo. GameFan of September 1997; Retro Gamer 69, 108, and 170; Computer Gaming World of September 1985 and December 1992.

Online sources include Polygon’s authoritative “Final Fantasy 7: An Oral History”; “The Long Life of the Original Wizardry” by guest poster Alex on The CRPG Addict blog; “Wizardry: Japanese Franchise Outlook” by Sam Derboo at Hardcore Gaming 101, plus an interview with Robert Woodhead conducted by Jared Petty at the same site; “Wizardry’s Wild Ride from West to East” at VentureBeat; “The Secret History of AnimEigo” at that company’s homepage; Robert Woodhead’s slides from a presentation at the 2022 KansasFest Apple II convention; a post on tabletop Wizardry at the Japanese Tabletop RPG blog; and “Dragon Warrior: Aging Disgracefully” by Nick Simberg at (the now-defunct) DamnLag.

Footnotes
1 Adams was not an entirely disinterested observer. He was already working with Robert Woodhead on Wizardry IV, and had in fact accompanied him to Japan in this capacity.
2 A man with an international perspective if ever there was one, Rogers would later go on to fame and fortune as the man who brought Tetris out of the Soviet Union.
3 It would be briefly revived for one final game, the appropriately named Wizardry 8, in 2001.
4 In another unexpected link between East and West, one of his most important assistants became Nasir Gebelli, an Iranian who had fled his country’s revolution for the United States in 1979 and become a game-programming rock star on the Apple II. After the heyday of the lone-wolf bedroom auteur began to fade there, Doug Carlston, the head of Brøderbund, brokered a job for him with his friends in Japan. There he maximized the Famicom’s potential in the same way he had that of the Apple II, despite not speaking a word of Japanese when he arrived. (“We’d go to a restaurant and no matter what he’d order — spaghetti or eggs — they’d always bring out steak,” Sakaguchi laughs.) Gebelli would program the first three Final Fantasy games almost all by himself.
5 Square did release a few spinoff games under the Final Fantasy label in the United States and Europe as another way of testing the Western market: Final Fantasy Legend and Final Fantasy Adventure for the Nintendo Game Boy handheld console, and Final Fantasy: Mystic Quest for the Super Nintendo. Although none of them were huge sellers, the Game Boy titles in particular have their fans even today.
 

Posted on November 17, 2023 in Digital Antiquaria, Interactive Fiction

 


A Digital Pornutopia, Part 1: The Seedy-ROM Revolution

Fair warning: although there’s no nudity in the pictures below, the text of this article does contain frank descriptions of human anatomy and sexual acts.

If I’m showing people what a CD-ROM can do, and I try to show them the Kennedy assassination, their eyes glaze over. But if I show them an adult title, they perk right up.

— John Williams of Sierra On-Line, 1995

As long as humans have had technology, they’ve been using it for titillation. A quarter of a million years ago, they were carving female figurines with exaggerated breasts, buttocks, and vulvae out of stone. Well before they started using fired clay to make pottery for the storage of food, they were using the same material to make better versions of these “Venus figurines,” some of them so explicit that the archaeology textbooks of the late nineteenth and early twentieth centuries didn’t dare to reproduce or even describe them. And then as soon as humans had pigments and dyes to hand, they used them to paint dirty pictures on the walls of their caves.

Immediately after the world’s first known system of writing came to be in Mesopotamia, the people of that region began using it for naughty stories. (In one of them, a new wife tells her husband to place his hand in her “goodly place,” promising to compensate him in kind thereafter: “Let me caress you. My precious caress is more savory than honey.”) The ancient Greeks covered their household and decorative items with sexual imagery. And the Romans too were enthusiastic and uninhibited pornographers, as evidenced by the famous erotic wall frescoes inside Pompeii’s brothel.

Things changed somewhat in Europe with the rise of Christianity, a religion which, unlike the pagan belief systems that preceded it, framed the questions surrounding sex — whether you did it, with whom you did it, how you did it, even how and how much you thought about doing it — as issues of intense spiritual significance. “To be carnally minded is death,” wrote Saint Paul.

Yet the vicarious desires of the flesh couldn’t be quelled even by the prospect of an afterlife of eternal torment. Porn simply went underground, where it has remained to a large extent to this day. The Decameron of Boccaccio and The Canterbury Tales by Geoffrey Chaucer, two of the most celebrated examples of Medieval literature, preserve some of the bawdy tales of promiscuous priests and nubile nuns that the largely illiterate peasantry told one another when they gathered in taverns out of earshot of their social betters.

The Gutenberg Bible of the fifteenth century was followed closely by less rarefied printed books, with titles like The Errant Prostitute. In the seventeenth century, the renowned English diarist Samuel Pepys wrote ashamedly of how he had found a copy of a book called The Girls’ School in a second-hand store, and spent so much time perusing its pages whilst, er, indulging himself that he finally felt compelled to burn “that idle, roguish book.” The book generally considered the first true English novel, Samuel Richardson’s Pamela, appeared in 1740; John Cleland’s Fanny Hill: Memoirs of a Woman of Pleasure, the first English erotic novel, was published just eight years later. It spawned a lively market for written erotica, the consumption of which, followed perhaps by a visit to a brothel or two, became the Victorian Age’s version of sex education for gentlemen.

And then along came photography. Some of the first photographs ever taken were of nude women and cavorting couples; traveling circuses sold them from under the table to furtive men while their wives and children were off buying sweetmeats. Improved photographic techniques, combined with cheap “pulp” paper and cheaper printing technologies, led to the first mass-market “skin” magazine in 1931: The Nudist, a publication of the American Sunbathing Association, who knew perfectly well that most of their audience wasn’t buying the magazine for tips on how best to soak up the rays. Throughout modern times, pornographers have often couched titillation under just such a veneer of “educational value.” And another, even more enduring truth of modern porn was also on display in The Nudist’s pages: the camera lens focused most lovingly on the women. The producers and consumers of visual pornography have always been mostly men, although the reasons why this has been the case are matters for debate among psychologists, anthropologists, and sociologists. (The opposite is largely the case for textual erotica in the post-photographic age, for whatever that’s worth.)

By the time The Nudist made the scene, still images of nakedness already had serious competition. Back in 1896, Thomas Edison had released a movie called The Kiss to widespread outrage. “They get ready to kiss, begin to kiss, and kiss and kiss and kiss in a way that brings down the house every time,” ran the description in Edison’s catalog, rather overselling the thrill of an 18-second movie that ends with little more than a chaste series of henpecks. But never fear, matters quickly escalated from there. The first known full-fledged porn film appeared in 1908. A synopsis of the action shows that, when it comes to porn as in so many other things, those wise words from the Book of Ecclesiastes (“The thing that hath been, it is that which shall be; and that which is done is that which shall be done; and there is no new thing under the sun”) ring true: “A woman was seen pleasuring herself with a dildo, then another man and woman joined her for a threesome, involving lots of oral sex before intercourse.” By the 1920s, a full-fledged “blue” film industry was thriving in Hollywood alongside the more respectable one, using many of the same sets and in some cases even the same performers. The arrival of strict Hays Code enforcement in 1934 put a damper on some of the fun, but the blue movies never went away, just slunk further underground. During the Second World War, the American military turned a blind eye to the “stag films” that infiltrated its ranks at every level; those who made them said it was their patriotic duty to help the boys in uniform through all those lonely nights in the barracks.

The sexual revolution of the 1960s brought big changes to movies, both above- and below-ground. Suddenly things that hadn’t been seen in respectable fare since 1934 — bare female breasts and sex scenes prominent among them — became commonplace in even the most conservative Middle American movie houses. At the same time, the arrival of cheap Super 8 cameras and projectors combined with the general loosening of social mores to yield what some connoisseurs still call the “golden age” of porn. To distinguish themselves from a less prudish mainstream-film industry, the purveyors of porn pushed boundaries of their own, into every imaginable combination of people and occasionally animals, with ever tighter shots of the actual equipment in action, as it were.

By 1970, there were more than 750 porn-only movie theaters in the United States alone. In 1972, a pornographic comedy called Deep Throat, a timeless tale of a woman born with her clitoris on the wrong end of her torso, became an international sensation, shown even by many “legitimate” theaters. Made for $25,000, it eventually grossed $100 million, enough to make it far and away the most successful film of all time when measured in terms of return on investment.

At the time, Deep Throat was widely heralded as the future of porn, but the era of mainstream “porno chic” which it ushered in proved short-lived. Instead of yielding more blockbusters in the style of respectable Hollywood, the porn industry that came after was distinguished by the sheer quantity of material it rolled out for every taste and kink. This explosion was enabled by the arrival of videotape in 1975; shooting direct to video was cheaper by almost an order of magnitude than even Super 8 film. But even more importantly, videotape players for the home gave the porn hounds and would-be porn hounds of the world a way to consume movies in privacy, thereby giving the porn industry access to millions upon millions of upstanding pillars of their communities who would never have frequented sticky-seated cinemas. Every porno kingpin in the world redoubled his production efforts in response, even as they all rushed en masse to move their libraries of “classics” onto videotape as well. The gold rush that followed made Deep Throat look like the flash in the pan it was. In 1978 and 1979, three quarters of all home movies sold in the United States were porn. That percentage inevitably dropped as the technology went more mainstream in the 1980s, but the raw numbers remained huge. Even in the late 1990s, Americans were still renting $4.2 billion worth of porn on videotape each year.

Advances in voice communication as well were co-opted by porn. The party lines of the 1960s and the pay-per-call services that sprang up in the 1980s were quickly taken over by it: respectively, by people wanting to talk dirty to one another and by people willing to pay a professional to talk dirty to them.

Just why is porn so perpetually on the technological cutting edge, both driving and being driven by each new invention that comes around? Perhaps it’s down to its fundamentally aspirational nature. While voyeurism for its own sake certainly has a place on the list of human fetishes, for most of its consumers porn is a substitute — by definition a less than ideal one — for something they’re not getting, whether just in that precise moment or in their current life in general. So, they’re always looking for ways to get closer to the real thing, as Walter Kendrick, the author of a book-length history of porn, told The New York Times back in 1994:

Pornography is always unsatisfied. It’s always a substitute for the contact between two bodies, so there’s a drive behind it that doesn’t exist in other genres. Pornographers have been the most inventive and resourceful users of whatever medium comes along because they and their audience have always wanted innovations. Pornographers are excluded from the mainstream channels, so they look around for something new, and the audience has a desire to try any innovation that gives them greater realism or immediacy.

If you look at the history of pornography and new technologies, the track record has been pretty good. Usually everyone has come out ahead. The pornography people have gotten what they want, which is a more vivid way to portray sex. And the technology has benefited from their experimentation. The need for innovation in pornography is so great that it usually gets to a medium first and finds out what can be done and what can’t.

If early computers are not as strong an example of this phenomenon as photography, film, and videotape, that perhaps speaks more to the nature and limitations of that particular technology than it does to any lack of abstract interest in using computers for getting off. Yet sex definitely wasn’t absent from the picture even here. I’ve written elsewhere on this site of how the men who worked at Bell Labs in the 1960s contrived ways to put naked women on their monitor screens using grids of alphanumeric characters in lieu of a proper bitmap, of how Scott Adams sold games of Strip Dice alongside his iconic early adventure games, and of how Sierra On-Line once made a mint with the naughty text adventure Softporn, about a loser who wishes he was a swinger trying to navigate the dating scene of the late disco era.

Softporn may have had a name that no big publisher like Sierra would have dared use much after its 1981 release date, but in other ways it became the template for the mainstream games industry’s handling of sex. The rule was that a little bit of sexiness was okay if it was presented in the context of comedy. Infocom’s Leather Goddesses of Phobos went this way to significant success in 1986, helping to convince Sierra to revisit the basic premise of Softporn in the graphical adventure Leisure Suit Larry in the Land of the Lounge Lizards the following year. Larry starred in a whole series of games after his first one proved a huge hit, becoming the poster child for risqué computer gaming. His success in turn brought more games of a similar style from other publishers, from Accolade’s Les Manley (yes, really) series to Legend’s Spellcasting series. The players of such games walked a fine line between being vicariously titillated alongside their protagonists and laughing at them for being the losers they were.

Still, even the Church Lady would have blanched at calling such games porn; they were in fact no more explicit than the typical PG-13-rated frat-boy movie. To find pictures of actual naked women on computers, you would have to download them — slowly! — from a BBS, or buy a still less respectable game like Artworx’s Strip Poker via mail order.[1] Few distributors or retailers would risk handling such titles, what with the mainstream culture’s general impression that any and all forms of digital entertainment were inherently kids’ stuff.

But by 1993 or so, that impression was at last beginning to change. The arrival of mouse-driven user interfaces, decent graphics and sound, and CD-ROM had made “multimedia” one of the buzzwords of the zeitgeist. Part and parcel of the multimedia boom was a new generation of “interactive movies,” which featured real human actors playing their roles according to your instructions. Computer games, went the conventional wisdom, were growing up to become sophisticated adult entertainment — potentially in all senses of the word “adult.” Just like the makers of conventional movies during the 1960s, the makers of these interactive movies were eager to push the boundaries, to explore previously taboo subjects and see how much they could get away with. They had to be careful; 1993 was also the year of an overheated Senate hearing on videogame content, partially prompted by Night Trap, an early interactive movie for the Sega CD. But be that as it may, many of the people behind interactive movies believed strongly in the principle that they should be allowed to show anything that a non-porn traditional film might, as long as, like such a film, they played with an open hand, disclosing the nature of the content to possible buyers upfront.

The pioneer of the risqué — but not pornographic — interactive movie was a game called Voyeur, released in 1993 for the Philips CD-i, a multimedia set-top box for the living room — don’t call it a games console! — that was marketed primarily to adults. The following year, Voyeur made it to ordinary computers as well.

Masterminded by David Riordan, whose earlier attempt at an interactive movie, It Came from the Desert, has aged much better, Voyeur casts you as a private dick surveilling a potential candidate for the American presidency, a fellow who makes Gary Hart and Bill Clinton look like choirboys. Everything from infidelity to incest, from sado-masochism to an eventual murder is going on in his house. But, like so many productions of this ilk, the game is caught between the urge to titillate and the fear of going too far and getting itself blacklisted. The end result is a weird mixture of the provocative and the prudish, as inimitably described by Charles Ardai, the most entertaining reviewer ever to write for Computer Gaming World magazine.

For those connoisseurs of striptease who prefer the tease to the strip, Voyeur should be a source of endless delight. Women are forever unfastening their bra straps in this game, or opening their towels while conveniently facing away from the camera, or walking around in unbuttoned vests that don’t quite reveal what you think they’re going to, or leaning toward each other for lesbian kisses that somehow never get completed. Men have it worse in some ways: they get led around in bondage collars, handcuffed to bedposts, and violently groped by their sisters. No one actually manages to have sex, though; all they do is go around interrupting each other. No wonder that after several hours of this someone ends up murdered.

Voyeur.

Coming a year after a panties-less Sharon Stone had shocked movie-goers in that scene in Basic Instinct, in a time when even network television shows like NYPD Blue were beginning to flirt with nudity, Voyeur was pretty tame stuff, notable only for existing on a computer. There was very little actual game to Voyeur, even by comparison to most interactive movies; as the player, your choices were limited to deciding which window to peep into at any given time. Yet it attracted a storm of press coverage and sold very well by the standards of the time — enough to prompt a belated 1996 sequel that was if anything even more problematic as a game and that didn’t sell anywhere near as well.

In fact, only one other game of this stripe can be called an outright commercial success. That game was Sierra’s 1995 release Phantasmagoria, designed against type by Roberta Williams, best known as the creator of the family-friendly King’s Quest series. Having written about it at some length in another article, I won’t repeat myself here. Suffice to say that Phantasmagoria became Sierra’s best-selling game to date, despite or because of content that was not quite as transgressive as the game’s advertising made it appear. In his memoir, Sierra’s co-founder (and Roberta’s husband) Ken Williams speaks of the curious dance — two steps forward, one step back — that interactive movies like this one were constantly engaged in.

There was a scene in Phantasmagoria which was shot with Victoria Morsell, the heroine of the story, topless. Roberta wrote the scene and helped direct it. If it were a horror film no one would have thought anything about it, other than giving the film an “R” rating. But this was a videogame, and the market hadn’t fully realized yet that interactive stories weren’t always just for kids. When it came time to release the game, we edited [it] to only show some side-boob. Even with only a hint of nudity, Phantasmagoria was not a game we felt was appropriate for children. [We] voluntarily self-rated the product with a large “M” for Mature.

But Ken Williams must have believed the market was evolving quickly, because just a year after the first Phantasmagoria Sierra released a sequel in name only — it had nothing to do with the characters or story of the first — called Phantasmagoria: A Puzzle of Flesh, which proudly strutted all of the stuff that its predecessor had only hinted at. Designed by Lorelei Shannon rather than Roberta Williams, A Puzzle of Flesh was as sexually explicit as any mainstream game would get during the 1990s, an interactive exploitation flick steeped in sado-masochism, bondage, and the bare boobies that had been so conspicuously lacking from Roberta’s game. Like too many Sierra adventure games, its writing and design both left something to be desired. And yet it was an important experiment in its way, demonstrating to everyone who might have been contemplating making a game like it that there really were boundaries which they would be well advised not to cross. In contrast to many organs of game journalism, Computer Gaming World did deign to give A Puzzle of Flesh a review, but said review oozes disgust: “Playing this game, if one can grace this morally reprehensible product with such happy terms as ‘play’ and ‘game,’ is extremely unpleasant; to do so ‘for fun’ requires a fascination with hardcore schlock or a hardened attitude toward horror and exploitive erotica. You have been duly warned.”

Phantasmagoria: A Puzzle of Flesh.

It speaks to one of the peculiarities of American culture that the magazine could work itself into such a lather of moral outrage over this game whilst praising the booby-less ultra-violence of HyperBlade (“The 3D Battlesport of the Future!”) in the very same issue. (“Fractured skull. Severed bronchial artery. Shattered tibia. This will eventually come as music to your ears. Want to cut an opponent’s head off and throw it in the goal? Pretty brazen, but that’s okay too.”) But such were the conditions on the ground, and publishers had to learn to live with them. Sierra never made another full-fledged interactive movie after A Puzzle of Flesh. The genre as a whole slowly died as the limitations of games made from spliced-together pieces of canned video became clear to even the most casual players. The dream of a games industry that regularly delivered R-rated content in terms of sex as well as violence blew away like so much chaff on the breeze alongside the rest of the interactive-movie fad. Mind you, games would continue to be full of big-breasted, scantily-clad women to feed the male gaze, but gamers wouldn’t get to see them in action in the bedroom.

Yet this isn’t to say that there was no sex whatsoever to be had on CD-ROM. Far from it. Even as the above-ground industry’s fad for interactive movies was swelling up and then petering out, there was plenty of explicit sex available for consumption on “seedy ROM,” the name a wag writing for The New York Times coined for the underground genre. A cottage industry sprang up around the technology of CD-ROM when it was still in its infancy — an industry which it would be disarmingly easy for a historian like me, trawling through my old issues of Computer Gaming World and the like, never to realize existed. This has always been the way with porn; despite its eternal popularity, it lives in the shadows, segregated from other forms of media whose consumers are less ashamed of their habit. It is forced to exist in its own parallel universe of distribution and sales — and yet people who are determined to see it always find a way to meet it where it lives. This was as true of porn during the multimedia-computing boom as it has been in every other technological era and context.

The pioneer of the seedy-ROM field, coming already in 1990 — fully three years before Voyeur, at a time when the standards around CD-ROM had barely been set and most people had not yet even heard of it — was a product of a tiny company called Reactor, Incorporated. It was called Virtual Valerie. Implemented using Apple’s HyperCard authoring system, it plays a bit like The Manhole with sex; you can explore your girlfriend Valerie’s apartment, discovering a surprising number of secret paths and Easter eggs therein, or you can spend your time exploring Valerie herself. Relying on hand-drawn pixel graphics rather than digitized photographs or video, it’s whimsical in personality — almost innocent by contrast with what would come later — but it nevertheless demonstrated that there was an eager market for this sort of thing. “A left-handed mouse designed for use by right-handed people will be a hot seller,” wrote Anne Gregor only partly jokingly in CD-ROM Today, one of the few glossy magazines that dared to cover this space at all.

Virtual Valerie.

Virtual Valerie was the trailblazer. In her wake, dozens of companies plunged into making and selling porn on CD-ROM, smelling an opportunity akin to the porn-on-videotape boom of the recent past. They sold disks full of murky images downloaded from Usenet, where a lively trade in porn went on alongside lively discussions. They sold old porn movies on CDs, the better for gentlemen to view on the computer in the home office, away from the prying eyes of the wife and kids. (The Voyager Company’s version of the Beatles film A Hard Day’s Night was promoted in 1993 as the very first full-length movie on CD-ROM, but it actually wasn’t this at all: it was rather the first non-pornographic movie on CD-ROM.) And then there were the more ambitious, genuinely interactive efforts that followed in the footsteps of Virtual Valerie. “Virtual girlfriends” became a veritable sub-genre unto itself for a while. Girlfriend Teri, for example, claimed a vocabulary of 3000 words. Journalist Nancie S. Martin called her perfect for men who preferred “a woman you can talk to, who, unlike most real women, has no opinions of her own. While she can’t discuss Proust, she can say, ‘It’s so big!’ on demand.”

It was proof of a longstanding axiom in media: no matter how daunting the obstacles to distribution, porn will out. When American CD-duplication houses refused their business, the purveyors of seedy ROMs found alternatives in Canada. When no existing software distributor or retailer was willing to touch their products, when the big computer and gaming magazines too rejected them or allowed them only small advertisements sequestered in their back pages, they advertised in the skin magazines and urban newspapers instead, and sealed the deal through mail order and through porn-rental shops that stocked their CDs alongside the usual videotapes. Those who joined this latest porn gold rush were a variegated lot, ranging from the likes of Playboy and Penthouse, whose libraries of photographs stretched back decades, to hardcore pornography pimpers like Vivid Media with catalogs of their own that were equally ripe for re-purposing, to fresh faces who were excited about the overall potential of multimedia, and saw porn as a way to make money from it or to nudge it along as a consumer-facing technology, or both.

One of these newcomers to porn was New Machine Publishing, founded by one Larry Miller and two other 25-year-old men in 1992. “At first, we thought we’d do a CD-ROM on the rain forest,” Miller told The Los Angeles Times three years later. “It was gonna be interactive, have bird calls, native music, all that stuff. Then we discovered we were not thinking in real-world terms. No one would have bought it.” Instead they acquired a bunch of footage from a local porno-production company and stuck it on a CD alongside a frame story not that far away in spirit from Voyeur: you were a guest at a hotel where all the rooms were wired for video and sound, and you were in the catbird seat in the control room. The difference, of course, was that their “interactive porn movie,” which they called Nightwatch, didn’t shirk from the money shots. They took 500 Nightwatch discs to a Macworld show in San Francisco and sold every single one of them, at $70 a pop. Attendees “would cruise by,” remembered Miller. “Then they would get to the end of the aisle and do a U-turn.”

It turned out that porn was immune to the infelicities of this early era of digital video that caused customers to turn away from more upstanding multimedia productions. While few could convince themselves to overlook the fact that Voyager’s A Hard Day’s Night played in a resolution more suited to a postage stamp than a computer monitor, they were happy to peer at Nightwatch‘s even blearier clips, jittering before them enigmatically at all of five frames per second. New Machine plowed their Nightwatch profits into The Interactive Adventures of Seymore Butts (yes, really), which did for Leisure Suit Larry what their previous seedy ROM had done for Voyeur. Everyone knew the drill of this sort of scenario by now: another loser protagonist out on the town trying to score. Except that Seymore would get lucky in a way that Larry or Les and their players could hardly have begun to imagine.

Seymore Butts, ladies’ man, on the prowl.

Indeed, New Machine was now able to pay for their own film shoots. Robert A. Jones of The Los Angeles Times was permitted to visit the set one day, and returned to bear witness to the ultimate hollowness of mechanistic sex without emotion or context: “Watching it unfold, it’s hard to believe anything of redeeming value could be salvaged from this scene of bored sordidness.” There has always been money to be made in some people’s compulsion to keep watching porn long after the novelty is gone. Why should seedy ROMs be any different?

But, you might be asking, how much money are we talking about here? Alas, that’s a tough question to answer with any degree of certainty. Because porn lives underground, both abhorring the light of mainstream attention and being likewise abhorred, it has always been notoriously difficult to figure out how much money people are actually making from the stuff. Porn on CD-ROM is no exception. That said, the sheer quantity of it that was out there by the middle of the 1990s indicates that real money was being made by at least some of its providers. To be sure, no one was selling discs in the quantities of Voyeur or Phantasmagoria. But then again, they didn’t have to, thanks to vastly cheaper production budgets and the ability to charge a premium price. This latter has always been one of the advantages of peddling smut: most people just want to take their porn and disappear back into blessed anonymity, not haggle and bargain hunt.

We do have some data points. The founders of New Machine Publishing alluded vaguely to selling “tens of thousands” of copies of Seymore Butts, more than many a more wholesome point-and-click adventure game of the era. The first Penthouse Virtual Photo Shoot disc — “Be the photographer!” — reportedly sold 30,000 copies in its first three months. (In another testament to its popularity, no fewer than five additional volumes, featuring different sets of girls for the camera’s roving eye to capture, were later released.) Vivid Entertainment’s interactive division reported having four 10,000-plus sellers on CD-ROM already in the spring of 1994, barely six months after its founding. Fay Sharp, who acted as a distributor and liaison between seedy-ROM makers and mom-and-pop porn shops all over the country, claimed that $260 million worth of porn on disc was sold in that same year. Pixis Interactive claimed multiple titles with sales of 50,000 to 100,000 units a couple of years after that. For all that such numbers may still have been a drop in the bucket compared to porn on videotape, they could mean serious money for the people involved, with the tantalizing prospect of a lot more where that had come from in the future, once multimedia personal computers had become as ubiquitous in American homes as videocassette players. In short, and despite the fact that seedy ROMs would never blow up quite as spectacularly as many of the folks involved in them expected — we’ll get to the reasons for that shortly — all signs are that some people in this market did very well for themselves for a while, thanks to sales that in many cases seem to have dwarfed the numbers racked up by, say, The Voyager Company’s lineup of high-brow explorations of history and science.

Perhaps the best testimony to the real if short-lived success of porn on CD-ROM is a brouhaha that erupted at COMDEX, the buttoned-down computer industry’s biggest trade show, held each November in Las Vegas. In 1993, The New York Times reported with considerable surprise that porn had become one of the highlights of the show in the opinion of businessmen who were ostensibly there to investigate decidedly unsexy server racks and backup systems and the like.

“CD-ROM brings unimaginable quantities of knowledge to those for whom information is their most valuable asset,” said Bill Kelly, president of PC Compo Net of La Habra, California, who sells such knowledge bases as L.A. Strippers: Bikes & Babes & Rock ‘n’ Roll. PC Compo Net was among a half dozen companies selling CD-ROM titles that depict graphic sex with video, sound, and still images. The displays caused traffic jams in the aisles as the predominantly male computer crowd stopped to gawk. Many customers waved fists of cash. One harried booth clerk noted that CD-ROM is ideal for circumventing Japanese laws restricting pornographic films and magazines, because the mirror-like disks give no clue as to their contents and can easily be slipped through customs in an audio-CD case.

Some COMDEX exhibitors complained about their X-rated neighbors on moral grounds; others said they were pleased by the crowds. One Japanese computer dealer, who asked that his name not be used, said it appears that pornography may be the long-awaited “killer” application that will spur the sale of CD-ROM drives. “People who develop CD-ROM software should be overjoyed that we’re here, because we’re helping sell CD-ROM drives,” said Lawrence Miller, one of three partners in New Machine Publishing of Santa Monica, California, which makes games with explicit sexual content.

Concerned about their show’s image, the COMDEX organizers moved the seedy-ROM exhibits into a dank corner of the basement the following year — an apt metaphor for pornography’s place in polite society, yes, but one the exhibitors themselves considered less than gracious, given that they were by now contributing about half a million dollars to the show’s bottom line. In return, COMDEX couldn’t even be bothered to keep the power on for them all the time down there in the cellar. Its organizers were deaf to their complaints: “COMDEX is a place where people show creative new products for the benefit of the industry. This is not appropriate.”

So, Fay Sharp, a rare woman in porn in a role other than that of performer, set up her own show in 1995 just a few blocks away from that year’s COMDEX. Called AdultDex, it charged just $20 admission if you could show a COMDEX ticket. “COMDEX is the classroom and AdultDex is recess,” said William Margold of the Free Speech Coalition, one of the new show’s sponsors. About 7500 people came by that first year, while the bigger show’s organizers huffed and puffed and threatened this unwelcome parasite. In the end, though, there was nothing they could do about it.

AdultDex became a regular event thereafter, drawing upwards of 20,000 attendees some years. The scene there was much the same as that in the countless Vegas strip clubs that catered to the same clientele of business road warriors enjoying a taste of freedom from the strictures of family life. (“COMDEX attendees don’t gamble, but they line up in those gentlemen’s clubs,” said the chairwoman of the tourism and convention department at the University of Nevada. “I’ve been on flights coming into Las Vegas sitting next to hookers who fly in just for that week.”) Journalist Samantha Cole describes AdultDex as “men in polo shirts and cheap brown suits standing too close to tanned women in bikinis and leather, or leaning their five-o’clock shadows on their breasts. One thing that’s never changed through decades of Las Vegas porn conferences: in this gin-scented setting, men and women do seem like separate species.”

In 1997, the police raided AdultDex — at the instigation, many suspected, of the COMDEX organizers — and handed out citations for “lewd and dissolute conduct” and “performing a live sex act.” But, as always, the porn show must go on. “We were on CNN, the local TV; we have even had an editorial in the newspaper that was favorable,” said Fay Sharp to Billboard magazine. “It’s giving us the kind of publicity we could never buy. It’s showing [that] adult interactivity is very much in the mainstream.” AdultDex wouldn’t pass into history until 2003, when it expired, like any good parasite, alongside its host, COMDEX itself.

The scene at AdultDex.

Yet even as Fay Sharp was saying those words in 1997, the nature of the products being peddled at AdultDex was changing markedly. The fact was that the seedy-ROM boom was already past its peak, already starting down the slope toward what all those dirty discs have become today: the kitschy detritus of a quaint past, media artifacts which manifestly failed to reach the world-changing heights once predicted for them. Pixis Interactive reported that year that its latest sales figures looked suddenly “ghastly.” The smart folks in the cottage industry were now plotting exit strategies from CD-ROM, which in some cases involved going legit — or at least more legit — in one way or another. The boys behind New Machine, for example, reinvested their Seymore Butts profits into casino games on CD-ROM, then took that concept online. At the turn of the millennium, their virtualvegas.com was one of the most popular online-gaming sites on the Internet.

Meanwhile those digital impresarios who intended to stay in porn were trying to figure out how to execute a dramatic shift in terms of delivery medium. For, just as video had once killed the radio star, the World Wide Web was now all too clearly killing the multimedia CD-ROM, regardless of whether it concerned itself with kinky sex or with the fate of the rain forests. Seedy ROMs wouldn’t disappear because of any shortage of horny boys and men — perish the thought! They would disappear because anything they could do, the Web could do better — cheaper and easier and, best of all, even more anonymously. The fact was that the Web was about to change humanity’s relationship to sex in general to an extent that few of even the most enthusiastic proponents of sex on CD-ROM would have ventured to predict. And, almost equally interestingly, sex was about to change the Web — change not only the sites people surfed to and what they did there, but make the whole place safe for commerce and its filthy lucre in a way that a million wide-eyed idealists like its original inventor Tim Berners-Lee could never have managed.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books How Sex Changed the Internet and the Internet Changed Sex by Samantha Cole, Obscene Profits: The Entrepreneurs of Pornography in the Cyber Age by Frederick S. Lane III, The Players Ball: A Genius, a Con Man, and the Secret History of the Internet’s Rise by David Kushner, The Erotic Engine by Patchen Barss, The Pornography Wars: The Past, Present, and Future of America’s Obscene Obsession by Kelsy Burke, and Not All Fairy Tales Have Happy Endings by Ken Williams; The New York Times of November 21 1993, January 9 1994, and October 4 1995; CD-ROM Today of June/July 1994; Electronic Entertainment of August 1994 and August 1995; Wired of July 1995 and February 1997; Computer Gaming World of March 1995, July 1996, and March 1997; The Los Angeles Times of March 19 1995; The San Francisco Chronicle of November 19 1997; frieze of March/April 1996; The Las Vegas Sun of November 19 1996; American Heritage of September/October 2000. Online sources include “Inside AdultDex” by Adi Robertson at The Verge, “COMDEX Trade Show Leaves Vegas” by Chris Jones at Casino City Times, and “History of Sex in Cinema” at filmsite.

Footnotes
1 A few years ago, I went to a retro-gaming exhibition here in Denmark with a friend of mine. We got to talking with a woman who worked at the museum hosting it, who told us of her own memories from those times. It seems her brother had managed to acquire a copy of Strip Poker for his Commodore Amiga. Being a lover of card games, she tried to play it, but all the stupid pictures kept getting in the way. “I just wanted to play poker!” she lamented. I suspect this story may have something to tell us about the differences between the genders, but I have no idea what it is.
 
 
