
Nine Princes in Amber

Technological futurists and the people who love them have been talking for some time now about something called the Singularity, that moment in the (near?) future when computing technology will reach some critical mass and change everything forever in ways we can hardly begin to imagine. I’m not so interested in discussing the merits of the idea here, but I do want to say that singularities can take many forms, and to note that the sort of singularities one sees are perhaps more emblematic of one’s own personal hobby horses than some might like to admit. In that spirit, I’d like to propose a singularity of my own, albeit one recently passed rather than oncoming. It landed right about the middle of the 1960s.

To see what I’m talking about, watch a movie or listen to a hit song from 1960 followed by one from 1970. While it may be extreme and rather narcissistic and certainly horridly Western-centric to divide all recent history into pre-1960s and post-1960s, it’s nevertheless hard for me to come up with another instant when everything changed so completely. Films and songs are of course only signifiers of the deeper changes in the culture: changes in gender roles and responsibilities, in race relations, in attitudes toward war and peace and government and the rights and responsibilities of the citizen. The 1960s changed the way people talked, the way they dressed, the way they thought in a way far more profound than the mere vicissitudes of fashion. Perhaps most of all, they changed what is still for so many the most uncomfortable of uncomfortable subjects, sex, forever. We’re still dealing with the fallout every day: in the United States, at least, your decision of which party to vote for still has a great deal to do with whether you think all of these changes were in general a good or a bad thing.

Even written science fiction, that literary ghetto which had hitherto marched along blissfully ignoring and being ignored by changes in the larger world of arts and letters, wasn’t insulated from these winds of change. A New Wave of writers poured into — the old guard might, and sometimes did, say “invaded” — the stolid old halls that the pulps had built. These new writers were very different from the old holy trinity of Asimov, Clarke, and Heinlein. They replaced an absolute faith in objectivity and rationalism with a tolerance for ambiguity and an honest curiosity about spirituality, particularly (this being the 1960s) of the Eastern variety. They replaced adventures in outer space with explorations (this again being the 1960s, when psychedelics were everywhere) of inner space. They replaced workmanlike (not to say clunky) prose with literary flights of fancy and experimental structures showing the influence of folks like James Joyce and William S. Burroughs; a surprising number of the New Wave stars were poets in addition to short-story writers or novelists, for God’s sake. They replaced characters that served primarily as grist for the mill of Plot and Idea with real, three-dimensional humans whose subjective experiences were the point of the works in which they featured. American science fiction, like seemingly every other institution in the country, went to war with itself for a time, with John W. Campbell opining on behalf of the Old Guard in the pages of Analog that the Kent State protestors had gotten what they deserved while Michael Moorcock preached anarchism and feminism from his soapbox as editor of New Worlds.

Roger Zelazny

One of the biggest stars of the New Wave is our real subject for today: the man with the perfect science-fiction writer’s name of Roger Zelazny. He burst onto the scene in the mid-1960s with a series of dazzling short stories and a short novel, This Immortal, which took place on a post-apocalyptic Earth populated by creatures and minor gods from a sort of fever dream of Greek mythology. Then in 1967 he delivered Lord of Light, an audacious transplantation of the Hindu pantheon — if you haven’t realized it already, Zelazny was big on myth — to an interstellar milieu. The structure was as intricate as many a Modernist novel, the language gorgeous. The central character, Mahasamatman (he “called himself Sam”), reminds one in his rebellion against the rest of the pantheon of no one so much as the Satan of Paradise Lost.

Lord of Light deservedly swept science fiction’s two biggest prizes, the Hugo and the Nebula Awards, for its year. Along with a groundbreaking collection of short stories of the same year edited by Harlan Ellison and to which Zelazny also contributed, Dangerous Visions, it’s gone on to stand as perhaps the perfect exemplar of New Wave science fiction and why it mattered — this even though Zelazny himself rejected the label. There was a moment there when Roger Zelazny was accorded the honor amongst a ridiculously strong field of fellow up-and-comers of being just possibly the most promising young writer in science fiction. Lord of Light was great, but, what with Zelazny still so young, many predicted even better things from him once he matured a bit, got beyond just dazzling with the sheer high-wire virtuosity of his language and plots and began to really dig into his worlds and themes.

But somehow that never quite happened. Oh, he continued to be astonishingly prolific, releasing for instance three novels in 1969 alone. His books remained readable; Zelazny was too professional to deliver anything else. Yet, while the reputations of contemporaries like Ursula K. Le Guin have only soared higher in the years since the heyday of the New Wave, Zelazny gradually found himself banished to the mid-lists, just another competent and salable genre writer. Much of his later work felt kind of forgettable, at its worst even kind of facile. Maybe it was down to an unwillingness to go to the hard places. Certainly it’s hard not to feel that this writer, who throughout his career cranked out novels at the rate of one or two every year along with a steady stream of short stories, might have benefited from just slowing down a bit, from applying all of his enormous energy to a single book for a while.

On the other hand, lots of readers — more than had enjoyed the likes of Lord of Light, actually — liked the later Zelazny, liked his readable, fast-paced novels that weren’t too demanding on either their reader or their writer. Zelazny, for his part, always rejected aspirations to literature in interviews, making it clear that he considered himself simply a working writer whose first consideration must be the financial. Even Lord of Light, he eventually revealed, had some commercial calculation at its base: he made it straddle the line between science fiction and fantasy in order to maximize its readership. Lovers of Zelazny’s early work could at least console themselves that even his most pedestrian novels still showed flashes of the old brilliance. Anyway, there was still plenty of time for him to buckle down and deliver another masterpiece. Until suddenly there wasn’t: he died of colorectal cancer at age 58 in 1995.

The flash point for lovers and haters of newer Zelazny is a series of ten fantasy novels set in a world called Amber. Drawing upon Zelazny’s usual mythical archetypes as well as Platonic philosophy, the Amber series postulates a perfect shining city on a hill, Amber itself, of which all other reality — or realities; infinite alternate universes worth of them — are but imperfect shadows. As one travels outward from Amber the shadows become steadily wilder and stranger, until one arrives at last at Amber’s polar opposite, the Courts of Chaos. The elemental forces of Order and Chaos which Amber and the Courts respectively represent exist in an uneasy symbiotic state — which doesn’t prevent them from constantly trying to get the upper hand on one another. Within Amber lives a royal family of superhumans and apparent immortals. They can communicate with one another and instantly jump to one another’s locations in Amber or in shadow via a set of magical cards, the family Trumps. They can also, albeit more laboriously, visit anywhere in shadow by simply walking — or driving, or riding — there, slowly manipulating and adjusting the reality around them as they go until they arrive at just the place they were looking for. (The early books dwell for some time on the intriguing philosophical question of whether they are visiting lands that always existed in shadow or creating them in their mind’s eye; like much else, however, this question is forgotten in the later books, by which time Amber is conducting trade negotiations with lands in shadow.) Amber’s royal family, consisting of an inconveniently absent father along with nine brothers and four sisters, is riven with far more strife and suspicion than one might expect from a family supposedly representing Order. Upon their various intrigues rest most of the series’s most compelling plots.

The first five Amber books, later to become known as the “Corwin Cycle,” were published between 1970 and 1978. They tell of the struggles of Prince Corwin of Amber, first against his hated brother Eric for the throne and later against the forces of Chaos who threaten Amber and the very fabric of reality itself. The books proved to be very popular, by far the most popular thing Zelazny had ever written. And so he wrote another five books, the “Merlin Cycle” describing the adventures of Corwin’s son, between 1985 and 1991. Most critics will tell you that the series declines in quality almost linearly, a half-step or so at a time starting right from the second book. The first book, Nine Princes in Amber, while much more straightforwardly written and plotted than the likes of Lord of Light, breathes the old Zelazny magic as we learn about this grandly mysterious multiverse and are introduced one by one to the family of Amber and their Shakespearian intrigues and rivalries. But as the books go on with strangely little differentiation from one to another — it really does feel as if Zelazny would just write the story until he had the 225 pages that was his publisher’s ideal length, then stop for a while — it begins to feel like just a series of long, anecdotal meanderings, particularly by the time we get to the much inferior Merlin Cycle. It’s pretty clear after a certain point in the latter that he’s making it up as he goes along, and apparently forgetting in the process a good part of what he’s already written. As Amber turns from a magical perfection to a mundane place that doesn’t seem all that qualitatively different from any of the shadows, as characters reverse themselves or change personalities entirely to suit Zelazny’s newest plotting whims, as ultimately pointless digressions come to occupy entire books worth of story, the later books manage to retroactively spoil much of what came before. 
By the time the whole thing sputters to a halt with the most anti-climactic of endings in which Merlin does exactly what he spent the previous several books saying he didn’t want to do, much of the allure of Nine Princes in Amber has long since been ground into dust.

That, anyway, is my attitude today. I should note that 25 years ago when I first read the Amber books I thought they were magnificent, Corwin and even Merlin the most dashing and cool heroes imaginable. Now they seem as often as not like smug, smirking jerks who are nowhere near as clever as they think they are. Merlin in particular, I’ve gradually come to realize, is actually as dumb as a box of rocks; he spends most of his time like the player’s character in a videogame, being manipulated and led by the nose through his foreordained plot by other characters in the story. Still, Amber remains readable even at its worst, even when you know that none of this is really going anywhere in particular; Zelazny knew how to craft a page turner. My wife and I used my omnibus Chronicles of Amber as bedtime reading for several months. By the end we were spending a lot of time making fun of its endless, exhaustively detailed fight scenes, the occasional stabs at free-verse poetry that misfire horribly, the creepy Mary Sue quality to Corwin and Merlin (like them, we weren’t surprised to learn, Zelazny was a fencing aficionado, but presumably beautiful women didn’t all fall swooning before him the way they did for them), and the sheer stupidity of the hapless Merlin, but we did finish all ten books. I suppose that says something. Thomas M. Wagner summed up the Amber series about as charitably as one can on his reviews site: “There’s no point in pretending this is great literature any more than, say, Edgar Rice Burroughs, but it captures the quintessence of pulp escapism with just about as much purity. It’s fast-paced, gobs of fun, and requires about as many brain cells as an old Johnny Weissmuller movie.” That should be good enough. Or it would be if Zelazny hadn’t proved himself capable of so much more. I’ll leave you to come down on whichever side you prefer.

Given its intriguing if not exactly rigorous fantasy milieu as well as the politicking that can make it seem like a fantastical version of Diplomacy, not to mention its considerable popularity at one time, Amber made a compelling setting for ludic narrative. In 1991, Erick Wujcik published the Amber Diceless Roleplaying Game, one of several streamlined tabletop RPG systems that appeared around that time with an emphasis on story and texture and, most of all, character interaction; this in contrast to older games like Dungeons and Dragons with their obsession with minutiae and tactical combat. Each player in the Amber Diceless Roleplaying Game takes the role of a member of the royal family. If everyone is in the proper Amber spirit, the gamemaster need not say much beyond that; the intrigues and betrayals all blossom naturally. Although it never gained the commercial prominence of fellow second-generation RPGs like White Wolf’s Vampire: The Masquerade, Amber attracted a cult of loyal players who still keep it alive today.

But long before the Amber Diceless Roleplaying Game there was another ludic Amber, this one produced by Telarium for the computer. Like the simultaneously released Perry Mason game, Nine Princes in Amber appeared just as its source material was getting a boost in the form of new installments after a fallow period of some years. In the case of Amber, this material took the form of Trumps of Doom, the first volume in the Merlin Cycle and first Amber novel since the Corwin Cycle had concluded seven years before. Roger Zelazny was happy to cash Telarium’s checks, but otherwise contributed even less to the project than had Arthur C. Clarke and Ray Bradbury to their respective games. He did graciously sign his name to a suitable back-of-the-box blurb: “I’m thrilled to see my Amber books become a challenging computer adventure. For anyone interested in exploring contingent paths through my tale, the possibilities here are almost endless.” The actual game, however, is a product of the same committee approach that yielded Perry Mason.

As such things go, it’s at least a very relevant blurb. Like Perry Mason, Nine Princes in Amber is a crazily unusual and ambitious work of interactive fiction. There’s a modest slate of object-oriented puzzles to deal with as well as an elaborate and frustrating fencing simulation that has all the problems typical of randomized combat in text adventures. There’s also a graphics-based mini-game that is, unlike the horrid arcade sequences in earlier Telarium games, actually quite fun to play. Yet the main focus is once again on character interaction. The included verb list is even more far-ranging than that of Perry Mason, including some entries that have quite possibly never featured in another work of interactive fiction before or since: verbs like “placate,” “flatter,” “mention,” “bluff,” and “stall.” The heart of the game is a series of tense encounters with your various siblings in which you’ll have the opportunity to try out those and many more.

That said, Nine Princes in Amber can at first seem underwhelming. The game seems to play out as a linear series of Reader’s Digest condensed scenes from the first two books, with most of the texture — like, inevitably, that provided by Corwin’s occasional amorous encounters — painfully absent. Do in any given scene what Corwin did in the book, and you get to continue to the next; do something else, and you get killed and see one of the “forty possible final endings” the box copy trumpets. As Jason Compton put it in a review on Lemon 64, gameplay can seem to devolve into, “All right, dammit, I know what Corwin did in the book, so how can I express it in terms the parser will understand?” In comparison to, say, Fahrenheit 451, which used its source novel as a springboard for something entirely new, this can seem depressingly unambitious, not to mention unchallenging for those who have read the books and impossible for those who haven’t.

But then, when you blunder your way at last to the end by trying to recreate the events of the novels as faithfully as possible, you get a shock: the ending you get is not a particularly good one. And so you begin to reexamine and reevaluate, and discover that Nine Princes in Amber is doing — or at least trying to do — something very audacious. It really is possible to forge your own path through the story, to end up with a set of allies and enemies radically different from those the novel’s Corwin ended up with in his own quest for the kingship of Amber. The claim of forty endings may be a stretch, but it’s possible to reach and win the climactic battle and still see the story branch at least four ways depending on your actions earlier in the game and your relationships with your siblings.

While the Corwin of the novels eventually thinks better of his own ambition to be king, this remains the goal of the Corwin of the game. The game’s universe is even more amoral than that of the novels; not for nothing do you find a copy of The Prince in your sister Flora’s study early in the story. I found I could be most successful by going into full Harry Flashman mode, lying and backstabbing and wheedling my way through events.

There are several choke points through which the narrative will always funnel, whether the player is trying to diverge from the novel or follow its plot exactly. Veterans of the books will recognize them immediately: the Pattern walk in Rebma, the time in the dungeon of Amber, the encounter with Benedict near Avalon, the final battle at the foot of Mount Kolvir. In between, the narrative can branch off in many directions. (This certain amount of linearity is necessary not least because the game is distributed on four disk sides for the Apple II and Commodore 64; the amount of disk flipping required would otherwise be horrendous.) Impressively, the reasons you arrive at the various choke points can be very different, and the relationships you’ve built or failed to build are preserved as you pass through them. In this sense of making all the pieces fit while preserving the player’s freedom, Nine Princes in Amber is one hell of an intricate piece of design.

Indeed, the game is in its way an amazing achievement. I know of no other text adventure from its era — and, come to think of it, possibly of any other — that offers this level of choice over not just the beats of the story or the order in which puzzles are solved but over the very direction of such a grand narrative. Yet it’s also often a pain to play, thanks as usual to that problematic Telarium parser. It’s nice that the game offers verbs like “placate,” but most of the time, even in conversations, most of these clever verbs do nothing; worse, it’s often hard to figure out whether any given verb is doing anything or not. Nine Princes in Amber has, in other words, all of the same problems as Perry Mason. If anything, they’re even more pronounced here.

After thinking about it a bit, I began to feel that even if its parser were much better something would still be off about the game. Many commands that do work are absurdly wide in scope and open to interpretation, sometimes causing hours or weeks to pass in the story: “walk in shadow,” “go to Brand,” “attack Amber.” Then it struck me: Nine Princes in Amber is really a choice-based narrative that’s been saddled with the wrong interface. Parsers are very good for complex but granular manipulations. Parser-based games are excellent tools for exploring geographical spaces and manipulating their contents, but not so good for exploring story spaces, for manipulating the narrative itself as does the player of Nine Princes in Amber. As Sam Kabo Ashwell wrote in his great series of articles about Choose Your Own Adventure books and other gamebooks (many of a vintage similar to this game), “CYOA is where you go when you want to prioritise free-flowing, bigger-scale narrative over deep or difficult interaction.” These are indeed the priorities of Nine Princes in Amber. The parser in this context only obfuscates what should be a delightful garden of forking paths. It leaves you constantly poking at unrewarding blind alleys that don’t work simply because that’s not one of the ways the plot is allowed to branch right now.

But imagine Nine Princes in Amber as a hypertext narrative with some limited state tracking and it all falls into place. One could create a node diagram like those Ashwell created for his articles if one was willing to spend enough time plumbing the game’s depths. This isn’t the first time I’ve observed such a disconnect between interface and content; I once went so far as to re-implement one of Robert Lafore’s pioneering experiments in ludic narrative as a choice-based game to prove a similar point. I won’t do the same here, although it is tempting; copyright concerns as well as the vastly greater complexity of the Telarium game prevent me. You’ll have to accept my word that this game would work perfectly well in any of the several viable modern hypertext-narrative engines.
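To make the idea of "hypertext with limited state tracking" concrete, here is a minimal sketch of such an engine. Everything here is hypothetical — the node names, the relationship flags, the structure — and drawn only from the general shape described above: choices adjust relationship state, and "choke point" nodes funnel every path back together while that state is preserved.

```python
# Minimal choice-based narrative engine with state tracking.
# All story content here is illustrative, not taken from the actual game.
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    # choice label -> (target node id, state changes applied on taking it)
    choices: dict = field(default_factory=dict)

@dataclass
class Story:
    nodes: dict
    state: dict = field(default_factory=dict)
    current: str = "start"

    def choose(self, label: str) -> str:
        """Take a choice: apply its state effects, move to its target node."""
        target, effects = self.nodes[self.current].choices[label]
        for key, delta in effects.items():
            self.state[key] = self.state.get(key, 0) + delta
        self.current = target
        return self.nodes[target].text

story = Story(nodes={
    "start": Node("Flora greets you warily.", {
        "flatter": ("rebma", {"flora": +1}),   # improves the relationship
        "bluff":   ("rebma", {"flora": -1}),   # sours it
    }),
    # A choke point: every branch funnels here, but the accumulated
    # relationship flags travel through it intact.
    "rebma": Node("You reach the Pattern in Rebma."),
})

story.choose("flatter")
# story.state now records flora == 1; story.current == "rebma"
```

Both choices converge on the same node, yet the state they leave behind can drive different branches later — which is exactly the "making all the pieces fit while preserving the player's freedom" quality described above, expressed in a dozen lines rather than fought for through a parser.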

So, chalk up Nine Princes in Amber as — stop me if you’ve heard this before — one more noble Telarium experiment that doesn’t really work as a playable game. Still, like Perry Mason, it’s worth some of your time just to marvel at its ambitions. Failures are after all often more instructive than successes. To experience Nine Princes in Amber, an interesting blend of both, feel free to download the Commodore 64 version here.

 


Wishbringer

Brian Moriarty, 1985

Brian Moriarty was the first of a second wave of Infocom authors from very different and more diverse backgrounds than the original Imps. Their fresh perspectives would be a welcome addition during the latter half of the company’s history. Some of the second wave all but stumbled through the doors of Infocom, but not Moriarty — not at all Moriarty. His arrival as an Imp in September of 1984 marked the fruition of a calculated “assault on Infocom” — his words, not mine — that had taken over two years to bring off.

Moriarty’s personal history is perfect for an Imp, being marked by a mix of technical and literary interests right from his grade-school years. After taking a degree in English Literature from Southeastern Massachusetts University in 1978, he found a job in a Radio Shack store, where on many days he spent hours playing with the TRS-80s. He didn’t buy a computer of his own, however, until after he had become a technical writer at Bose Corporation in Framingham, Massachusetts. It was there in 1981 that a colleague brought in his new Atari 800 to show off. Moriarty succumbed to the greatest Atari marketing weapon ever devised: the classic game Star Raiders. He soon headed out to buy an Atari system of his own.

Along with the computer and Star Raiders, Moriarty also brought home a copy of Scott Adams’s Strange Odyssey. He played it and the other Scott Adams games obsessively, thinking all the while of all the ways they could be better. Then one day he spotted Infocom’s Deadline on the shelf of his local Atari dealer. From its dossier-like packaging to its remarkable parser and its comparative reams of luxurious text, it did pretty much everything he had been dreaming about. Moriarty knew in an instant what he wanted to do, and where he wanted to do it. How great to learn that Infocom was located right there in the Boston area; that, anyway, was one less problem to deal with. Still, Infocom was a tiny, insular company at this point, and wasn’t exactly accepting resumes from eager Atari enthusiasts who’d never designed an actual game before.

So Moriarty put Infocom in his long-range planning folder and went for the time being somewhere almost as cool. Back at Radio Shack, he’d worked with a fellow named Lee Pappas, whom he’d been surprised to rediscover behind the counter of the local Atari dealer when he’d gone to buy his 800 system. Pappas and a friend had by then already started a little newsletter, A.N.A.L.O.G. (“Atari Newsletter and Lots of Games”). By the end of 1982 it had turned into a full-fledged glossy magazine. Pappas asked Moriarty, who’d already been a regular contributor for some months, if he’d like to come work full-time for him. Moriarty said yes, leaving his safe, comfortable job at Bose behind; it was “the best career move I ever made.”

A.N.A.L.O.G. was a special place, a beloved institution within and chronicler of the Atari 8-bit community in much the same way that Softalk was of the Apple II scene. Their articles were just a little bit more thoughtful, their type-in programs a little bit better, their reviews a little bit more honest than was the norm at other magazines. Moriarty, a graceful writer as well as a superb Atari hacker, contributed to all those aspects by writing articles and reviews and programs. Life there was pretty good: “It was a small group of nerdy guys in their 20s who loved computer games, ate the same junk foods, and went to see the same science-fiction movies together.”

Still, Moriarty didn’t forget his ultimate goal. Having advanced one step by getting himself employed in the same general industry as Infocom, he set about writing his first adventure game to prove his mettle to anyone — Infocom, perhaps? — who might be paying attention. Adventure in the Fifth Dimension appeared in A.N.A.L.O.G.‘s April/May 1983 issue. A necessarily primitive effort written mostly in BASIC and running in 16 K, it nevertheless demonstrated some traits of Moriarty’s later work by mixing a real place, Washington D.C., with fantastic and surreal elements: a group of aliens have stolen the Declaration of Independence, and it’s up to you to track down an entrance to their alternate universe and get it back. A year later, Moriarty continued his campaign with another, more refined adventure written entirely in assembly language. Crash Dive! pits the player against a mutineer aboard a nuclear submarine, a scenario much more complex and plot-heavy than the typical magazine-type-in treasure hunt. It even included a set of Infocom-style feelies, albeit only via a photograph in the magazine.

Crash Dive!'s "feelies"

With two games under his belt, Moriarty applied for a position as a game designer at Infocom, but his resume came right back to him. Then a colleague showed him a posting he’d spotted on the online service CompuServe. It was from Dan Horn, manager of Infocom’s Micro Group, looking for an expert 6502 hacker to work on Z-Machine interpreters. It took Moriarty about “45 seconds” to answer. Horn liked what he saw of Moriarty, and in early 1984 the latter started working for the former in the building where the magic happened. His first project involved, as chance would have it, another submarine-themed game: he modified the Atari 8-bit, Commodore 64, and Apple II interpreters to support the sonar display in Seastalker. Later he wrote complete new interpreters for the Radio Shack Color Computer and the ill-fated Commodore Plus/4.

He was tantalizingly close to his goal. Having broken through the outer gates, he just needed to find a way into the inner keep of the Imps themselves. He took to telling Berlyn, Blank, Lebling, and the rest about his ambition every chance he got, while also sharing with them his big idea for a game: a grand “historical fantasy” that would deal with no less weighty a subject than the history of atomic weapons and their implications for humanity. It seemed the perfect subject for the zeitgeist of 1984, when the Cold War was going through its last really dangerous phase and millions of schoolchildren were still walking around with souls seared by the previous year’s broadcast of The Day After.

Moriarty got his shot at the inner circle when a certain pop-science writer whom Infocom had hired to write a game was allegedly found curled up beneath his desk in a little ball of misery, undone by the thorny syntax of ZIL. This moment marks the end of Marc Blank’s dream of being able to hire professional writers off the street, set them down with a terminal and a stack of manuals, and wait for the games to come gushing forth. From now on the games would be written by people already immersed in Infocom’s technology; the few outside collaborations to come would be just that, collaborations, with established programmers inside Infocom doing the actual coding.

That new philosophy was great news for a fellow like Brian Moriarty, skilled coder that he was. The Imps decided to reward his persistence and passion and give him a shot. Only thing was, they weren’t so sure about the big historical fantasy, at least not for a first game. What they really had in mind was a made-to-order game to fill a glaring gap in their product matrix: a gentle, modestly sized game to introduce newcomers to interactive fiction — an “Introductory”-level work. And it should preferably be a Zorkian fantasy, because that’s what sold best and what most people still thought of when they thought of Infocom. None of the current Imps were all that excited about such a project. Would Moriarty be interested? He wasn’t about to split hairs over theme or genre or anything else after dreaming of reaching this point for so long; he answered with a resounding “Absolutely!” And so Brian Moriarty became an Imp at last — to no small consternation from Dan Horn, who’d thought Moriarty had come to Infocom to do “great work for me.”

It’s kind of surprising that it took Infocom this long to perceive the need for a game like the one that Moriarty would now be taking on as his first assignment. Their original matrix had offered only games for children — “Interactive Fiction Junior” — below the “Standard” level. Considering that even the hard-as-nails Hitchhiker’s Guide to the Galaxy was labelled “Standard,” the leap from “Junior” to “Standard” could be a daunting one indeed. Clearly there was room for a work more suitable for adult novices, one that didn’t condescend in the way that Seastalker, solid as it is on its own terms, might be perceived to do. Infocom had now decided to make just such a game at last — although, oddly, the problematic conflations continued. Rather than simply add a fifth difficulty level to the matrix, they decided to dispense with the “Junior” category entirely, relabeling Seastalker an “Introductory” game. This might have made existing print materials easier to modify, but it lost track entirely of Seastalker‘s original target demographic. Infocom claimed in The New Zork Times that “adults didn’t want a kid’s game; in fact, kids didn’t want a kid’s game.” Which rather belied the claim in the same article that Seastalker had been a “success,” but there you go.

Moriarty was a thoughtful guy with a bit of a bookish demeanor, so much so that his inevitable nickname of “Professor” actually suited him really well. Now he started thinking about how he could make an introductory game that wouldn’t be too condescending or trivial to the Infocom faithful who would hopefully also buy it. He soon hit upon the idea of including a magic MacGuffin which would allow alternate, simpler solutions to many puzzles at a cost to the score — literally a Wishbringer. The hardcore could eschew its use from the start and have a pretty satisfying experience; beginners could, after the satisfaction and affirmation of solving the game the easy way, go back and play again the hard way to try to get a better score. It was brilliant, as was the choice not to make using the Wishbringer just a “solve this puzzle” button but rather an intriguing little puzzle nexus in its own right. First the player would have to find it; then she would have to apply it correctly by wishing for “rain,” “advice,” “flight,” “darkness,” “foresight,” “luck,” or “freedom” whilst having the proper material components for the spell on hand, a perfect primer for the spellcasting system in the Enchanter trilogy. The wishes would, like in any good fairy tale, be limited to one of each type. So, even this route to victory would be easier but still in its own way a challenge.
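The elegance of that design is easy to state precisely. The sketch below models the three rules described above — each wish type works only once, only with the right material component in hand, and at a cost to the score relative to the hard solution. All names, components, and point values are my own hypothetical illustrations, not the real game's internals.

```python
# Illustrative model of Wishbringer's alternate-solution mechanic.
# Point values and component names are invented for the example.
WISHES = {"rain", "advice", "flight", "darkness", "foresight", "luck", "freedom"}

class Player:
    def __init__(self):
        self.inventory = set()
        self.score = 0
        self.wishes_used = set()

    def wish(self, kind, component_needed, puzzle_points, penalty):
        # As in any good fairy tale, each wish type may be used only once.
        if kind not in WISHES or kind in self.wishes_used:
            return False
        # The spell also demands its material component be on hand.
        if component_needed not in self.inventory:
            return False
        self.wishes_used.add(kind)
        # The easy solution still scores, but less than the hard way would.
        self.score += puzzle_points - penalty
        return True

p = Player()
p.wish("rain", "umbrella", 10, 5)      # fails: no component in inventory
p.inventory.add("umbrella")
p.wish("rain", "umbrella", 10, 5)      # succeeds at a 5-point cost
p.wish("rain", "umbrella", 10, 5)      # fails: that wish is spent
```

The hardcore player simply never calls `wish` and takes full points; the beginner spends wishes and finishes with a lower score, which invites a replay the hard way — the whole two-audience design in one small rule set.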

At first Moriarty thought of making Wishbringer a magic ring, but what with The Lord of the Rings and a thousand knock-offs thereof that felt too clichéd. Anyway, he wanted to include it in the box as a feelie, and, cost concerns being what they were, that meant the ring would have to be a gaudy plastic thing like the ones bubble-gum machines sometimes dispensed in lieu of a gumball. Then he hit upon the idea of making Wishbringer a stone — “The Magick Stone of Dreams.” Maybe they could make the one in the package glow in the dark to give it that proper aura and distract from its plasticness? Marketing said it was feasible, and so the die (or stone) was cast. Thus did Wishbringer become the first and only Infocom game to be literally designed around a feelie. Moriarty spent some nine months — amidst all of the Hitchhiker’s and Cornerstone excitement, the high-water mark that was Christmas 1984, an office move, and the dawning realization that the company was suddenly in big, big trouble — learning the vagaries of ZIL and writing Wishbringer.

Wishbringer

For all that it’s a much subtler work lacking the “Gee whiz!” quality of Seastalker, Wishbringer does feel like a classic piece of children’s literature. It casts you as a postal carrier in the quietly idyllic village of Festeron, which is apparently located in the same world as Zork and shares with that series an anachronistic mixing of modernity with fantasy. (I’m sure someone has figured out a detailed historical timeline for Wishbringer‘s relation to Zork as well as geography and all the rest, but as usual with that sort of thing I just can’t be bothered.) You dream of adventure — in fact, you’re interrupted in the middle of such a daydream as the game begins — but you’re just a mail carrier with a demanding boss. Said boss, Mr. Crisp, gives you a letter to deliver to the old woman who is proprietor of Ye Olde Magick Shoppe up in the hills north of town. On your way there you should explore the town and enjoy the lovely scenery, because once you make the delivery everything changes. The letter turns out to be a ransom note for the old woman from “The Evil One,” demanding Wishbringer itself in return for the safe return of her cat: “And now, now it claims my only companion.”

"It's getting Dark outside," the old woman remarks, and you can almost hear the capital D. "Maybe you should be getting back to town."

The old woman hobbles over to the Magick Shoppe door and opens it. A concealed bell tinkles merrily.

"Keep a sharp eye out for my cat, won't you?" She speaks the words slowly and distinctly. "Bring her to me if you find her. She's black as night from head to tail, except for one little white spot... right HERE."

The old woman touches the middle of your forehead with her finger. The light outside dims suddenly, like a cloud passing over the sun.

So, Wishbringer is ultimately just a hunt for a lost cat, a quest I can heartily get behind. But as soon as you step outside you realize that everything has changed. The scenery becomes a darker, more surreal riot reminiscent in places of Mindwheel. Mailboxes have become sentient (and sometimes carnivorous); Mr. Crisp has turned into the town’s petty dictator; a pet poodle has turned into a vicious hellhound. The game flirts with vaguely fascistic imagery, as with the giant Boot Patrols that march around the town enforcing its nightly curfew. (This does lead to one glaring continuity flaw: why is the cinema still open if the whole town is under curfew?) There’s a creepy dread and a creepy allure to exploring the changed town, a reminder that, as the Brothers Grimm taught us long ago, ostensible children’s literature doesn’t necessarily mean all sunshine and lollipops.

Like so much of Roberta Williams’s work, Wishbringer plays with fairy-tale tropes. But Moriarty is a much better, more original writer than Williams, not to mention a more controlled one. (Witness the way that the opening text of Wishbringer foreshadows the climax, a literary technique unlikely to even occur to Williams.) Rather than appropriate characters and situations whole cloth, he nails the feeling, balancing sweetness and whimsy with an undercurrent of darkness and menace that soon becomes an overcurrent when day turns to night and the big Change happens. The closest analogue I can offer for the world of Wishbringer is indeed the Brothers Grimm — but perhaps also, crazy as this is going to sound, Mr. Rogers’s Neighborhood of Make-Believe. Wishbringer has that same mixing of playfulness with a certain gravitas. There are even some talking platypuses, one of the very few examples of direct borrowing from Moriarty’s inspirations.

The other examples almost all come from Zork, including a great cameo from the good old white house and mailbox. And of course every Zork game has to have grues somewhere. The grues’ refrigerator light is my favorite gag in the whole game; it still makes me chuckle every time I think about it.

You have stumbled into the nesting place of a family of grues. Congratulations. Few indeed are the adventurers who have entered a grue's nest and lived as long as you have.

Everything is littered with rusty swords of elvish workmanship, piles of bones and other debris. A closed refrigerator stands in one corner of the nest, and something... a small, dangerous-looking little beast... is curled up in the other corner.

The only exit is to the west. Hope you survive long enough to use it.

 

Snoring fitfully, the little beast turns away from the light of the small stone and faces the wall.

>open refrigerator
A light inside the refrigerator goes out as you open it.

Opening the refrigerator reveals a bottle and an earthworm.

The little beast is stirring restlessly. It looks as if it's about to wake up!

>close refrigerator
A light inside the refrigerator comes on as you close it.

Indeed, while Moriarty is generally thought of as Infocom’s “serious” author on the exclusive basis of his second game Trinity, Wishbringer is full of such funny bits.

Wishbringer is very solvable, but solving it is not trivial even if you let yourself use the stone; this is of course just as Moriarty intended it. You may not even find the stone until a good third or more of the way through the game, and it definitely won’t help you with everything thereafter. Played without using the stone, I’m not sure that Wishbringer is really all that much easier than the average mid-period Infocom game. The most objectionable aspects for the modern player as well as the most surprising to find in an “Introductory” game are the hard time limits; you’re almost certain to need to restart a few times to fully explore Festeron before the Change and still deliver the letter in time, and you may need a few restores to get everything done that you need to after the Change. An inventory limit also sometimes complicates matters; Infocom had been slowly losing interest in this sort of purely logistical problem for years, but Wishbringer demonstrates that even in an introductory game they weren’t quite there yet. Still, those are design sins worth forgiving in light of Wishbringer‘s charms — assuming you think them sins at all. Like the determination to make you work a bit for a solution even if you use the stone, they could be seen as a good thing. Wishbringer, we should remember, was meant to serve as an introduction to Infocom’s catalog as a whole, in which players would find plenty of other timers and inventory limits and puzzles that refuse to just disappear in a poof of magic. Wishbringer‘s refusal to trivialize its purpose is really quite admirable; there’s even a (thankfully painless) pseudo-maze.

Wishbringer was released in June of 1985, eight full months after Infocom’s previous game Suspect. That gap would turn out to be by far the longest of Infocom’s productive middle years, and it left many fans worried about the company’s future and whether Cornerstone meant the end of games. Infocom’s idea that there were people potentially interested in interactive fiction but eager for a gentler version of the form turned out to be correct. Wishbringer turned into one of Infocom’s last genuine hits; Billboard software charts from the second half of 1985 show it and Hitchhiker’s regularly ensconced together inside the Top 20 or even Top 10, marking the last time Infocom would have a significant presence there. It sold almost 75,000 copies in its first six months, with a lifetime total perhaps as high as 150,000. To the best of my reckoning it stands as about Infocom’s fifth best-selling game overall.

Sales figures aside, Wishbringer‘s “Introductory” tag and its gentle, unassuming personality can make it an easy game amongst the Infocom canon to dismiss or overlook. That would be a shame, however; it’s one of the most likeable games Infocom ever did. While not one of Infocom’s more thematically or formally groundbreaking games and thus not one of their more discussed, it continues to be enjoyed by just about everyone who plays it. It’s the sort of game that may not come up that often when you ask people about their very favorites from Infocom, but mention it to any Infocom fan and you’ll almost always get back an “Oh, yes. I really liked that one.” Rather than bury its light charm under yet more leaden pontification, I’ll just suggest you play it if you haven’t already.

(Jason Scott’s interviews for Get Lamp informed much of this article. Interviews with Moriarty of various vintages can be found online at The IF Archive, 8bitfiles.net, Adventura CIA, Electron Dance, and Halcyon Days. Also useful was Moriarty’s “self-interview” in the January/February 1986 AmigaWorld; his picture above comes from that article. Adventure in the Fifth Dimension was published in the April/May 1983 A.N.A.L.O.G.; Crash Dive! in the May 1984 A.N.A.L.O.G., the last to which Moriarty contributed.)

 
 


Mindwheel (or, The Poet and the Hackers)

Mindwheel

Excepting only Adventure and a handful of works by Infocom, Robert Pinsky’s Mindwheel has received far more academic attention than any other work of interactive fiction’s commercial era. If you’re of a practical — not to say cynical — turn, you can posit a pretty good theory as to why that should be without ever looking to the game itself. Pinsky, you see, is by far the most respectable and respected literary figure ever to turn his hand to the humble text adventure. His resume is impressive to say the least: United States Poet Laureate from 1997 to 2000; author of nineteen books, nine of them full of poems; translator of Dante; professor of literature at Berkeley and Boston University amongst other places; editor of literary magazines and anthologies; scholar of the Biblical David and Shakespeare. For any graduate student looking to justify a thesis or article about interactive fiction, Pinsky is a riposte to die for when colleagues and advisers ask whether text adventures are really all that significant as literary works. If they were good enough for Pinsky, they should be good enough for anyone.

Mindwheel is the product of a strange historical moment; it’s hard to imagine it appearing more than a year before or after its February 1985 release date. This was the era of bookware, when interactive fiction was seen as the future of the book and the future of computerized entertainment all rolled into one; when action games were seen as relics of the recently passed age of the Atari VCS; when a company called Synapse Software, known already as the makers of some of the slickest and most graphically impressive action games on the Atari 8-bit line, could decide to stake much of their future on textual interactive fiction not out of some suicidal artistic impulse but because doing so seemed a perfectly reasonable commercial calculation. Strange, strange times.

Ihor Wolosenko

The story of Synapse Software is largely the story of Ihor Wolosenko, whose family had immigrated to the United States from Ukraine when he was still a toddler and who had filled the nearly forty years of his life before Synapse with a bewildering array of activities and avocations. He had studied drama at the City University of New York; been a professional photographer; worked as a physical therapist; counseled and conducted personal workshops using a combination of Tibetan Buddhism and the controversial branch of psychology known as neuro-linguistic programming; delved deeply into linguistics and hypnosis. By 1980, the year he bought an Atari 800, he had ended up like so many other drifting dreamers in Berkeley, California. He chose the Atari because it could play Star Raiders and the Apple II couldn’t.

Wolosenko soon made a more technical friend, a vice president in charge of data processing at the San Francisco Federal Reserve Bank named Ken Grant who had been toying with an Atari 800 database application in his spare time. The two worked on it together for nearly a year, then founded Synapse out of Wolosenko’s apartment to release it in August of 1981. It wasn’t an auspicious start; the first hundred or so copies of FileManager 800 that they shipped were so buggy that they had to recall the whole production run. But by the end of the year Synapse was truly up and running at last, with not just FileManager but a game or two as well.

Wolosenko was already putting together the team of crack programmers whose games would make Synapse’s reputation. Games like Shamus, Blue Max, The Pharaoh’s Curse, and their most beloved title of all Alley Cat mixed superb graphics with addictive playability and a welcome sense of whimsy. Little extra touches distinguished Synapse’s games from the competition. In Alley Cat, for instance, if you don’t do anything for a few seconds your avatar will start to move around on his own and meow impatiently to you, decades before such “juicy” touches would become a widely accepted requirement for casual games.

It wouldn’t be out of line to compare Synapse’s mystique in North America with that of Ultimate Play the Game in Britain. Both developed all of their games in-house, ensuring that they all shared a similar look and design sensibility. Both were absolute masters of their chosen platforms (the Spectrum for Ultimate, the Atari 8-bits for Synapse) and consistently delivered games that were far slicker than virtually anything the competition had to offer. Synapse, like Ultimate, did write for other platforms, but their core competency and core loyalty remained with the Atari machines. Atari users in turn loved them. Because Synapse’s games were born on Ataris, they could take full advantage of the best graphics and sound in the industry, capabilities matched only (and if you listen to Atari loyalists only arguably) by those of the Commodore 64.

While Wolosenko usually refused formal credit on his programmers’ designs, much of the character of Synapse’s games was down to him. His company may have been making relatively simple action games, but he nevertheless thought seriously about the nature of the medium, the relationship between player and avatar, the standard approach of graduated difficulty levels (bad) and the alternative of adaptive gameplay (good). He shepherded every game and every programmer through the process of development, giving a little nudge here, a little tweak there to make the end result that much better. Synapse programmer Steve Hales called Wolosenko the Steve Jobs of games: “Every product that Synapse produced had Ihor’s touch. I believe that because of Ihor our quality was better, the designs were more unique, and I was pushed beyond what I thought was possible.”

According to Hales, it was he and another of Wolosenko’s favorite programmers, William Mataga, who planted the idea of doing adventure games in Wolosenko’s head in late 1983. (William Mataga now lives as Cathryn Mataga. I refer to her by her previous name and gender in this article only to avoid historical anachronisms.) Hales and Mataga believed that Infocom had “old technology,” and Synapse could do better. Wolosenko didn’t take much convincing. Showing his usual enthusiasm, he laid out an ambitious if not entirely cogent manifesto for Synapse’s engine, which would be the work largely of Mataga.

The problem with these adventure games thus far, even the more interactive ones, is that you have the feeling of being in a corral. You go this way and someone says, “You can’t go that way.” If I say, “Toss something,” and it says, “I don’t understand that word,” when it just used that word in a description it drives me up the wall. It totally stops the experience for me. We’re going to have to work with some of those obstructions until we can solve some of the problems: not processing time, just putting the computer’s power to better use.

The most intricate puzzle is not a Rubik’s Cube, it’s a person. And it’s a character that changes. When you read bad fiction, the character comes in, he interacts with a lot of people, and he goes out exactly the way he came in. When you read a Tolstoy novel, the character is totally different at the end of the novel than when he came in at the beginning. And that’s what we’re trying to do. There is no reason why you have to be the same person during a game either. You could have a changeling-type game, where you’re a person at one point, you’re a dog at another, a bat at another.

Mataga dubbed his system BTZ — “Better than Zork” — to keep the end goal inescapable for everyone. Crucially, the vision was for pure text from the outset. Whereas rivals like Telarium sought to one-up Infocom by adding graphics and sound and even occasional action games to the mix to hopefully distract from their less than Infocom-quality parsers, prose, and world models, Synapse would go against them head to head, strength against strength. The games themselves Wolosenko first wanted to call “Microworlds” in light of the freedom and sense of realism they would offer. That soon changed, however, when he had his next brainstorm: to hire the best outside writers he could find — real writers — to craft the worlds and write the text. His Microworlds thus became Electronic Novels.

There is some evidence that the poet Robert Pinsky was far from Wolosenko’s first choice to craft the first Electronic Novel. In an interview published in the February 1984 issue of Ahoy! magazine, Wolosenko claimed that, while the contracts were not yet all signed, Synapse hoped to be employing the services of “top, top novelists [emphasis mine].” But Telarium and many others, some with pockets and connections much deeper than Synapse’s, were already trolling these waters. Wolosenko apparently soon decided that, if he couldn’t sign “top” writers in terms of sales and commercial appeal, he could hire the most prestigious, thereby underscoring the literary credibility of Synapse’s line. Somehow he jumped to the inspired choice of targeting not novelists but poets; perhaps he figured that, what with the term “popular poet” having been largely an oxymoron for decades already, they’d be more likely to jump at the chance for any sort of recognition. Surveying the possibilities, he came across the name of Robert Pinsky, who was teaching at UC Berkeley and thus an easy mark logistically. The resume of Pinsky, then about the same age as Wolosenko, was nowhere near as impressive as it is today, but he nevertheless had a burgeoning literary reputation, with two well-received books of poetry already published and a third in the galley stage. (Wolosenko would soon also tap another respected young poet, Jim Paul, for another game in the line.)

Robert Pinsky

One day as Pinsky was sitting in his office in Berkeley’s English department having spent the last several hours dealing with some of the more tedious administrative details that come with being a professor, his phone rang. It was Ihor Wolosenko on the line.

He said, “Are you familiar with computer text adventures?”

I said, “No.”

He asked whether I owned a computer.

I said, “No.”

Had I ever heard of Zork?

“No.”

Would I be interested in writing the text for an interactive computer work?

I said, “Yes, I might be.”

Pinsky drove out to visit Synapse’s offices. Wolosenko introduced him to some of his programmers and also to the concept of text adventures.

I liked it. My romantic idea was that it was like those first guys figuring out what movies were going to be on Long Island — playing with movie cameras. I didn’t see any reason that you couldn’t make a work of art. Art is alternate realities — realities that are in some ways like the reality we experience and in some ways quite unlike it. This was that. And it was clear to me from my small experience of adventures — the description of Zork, the stuff I saw on those monochrome monitors — that this was largely about the quest plot, one of the basic plots of great works. The Gilgamesh epic is a quest for the nature of immortality — or the nature of death, the nature of mortality. “KILL DWARF,” “GET SWORD,” etc., was completely in that line. Indeed, the imagery was very traditional.

It was agreed that Pinsky would come up with five or six ideas for possible games. Then Synapse would decide which one might be the most intriguing and realizable. The one that Pinsky himself considered the “silliest” sent the player on a journey through four minds: an assassinated rock star with a messiah complex, clearly modeled on John Lennon; a bloody dictator inspired by Hitler and Stalin and the rest of the twentieth century’s sad litany; a brilliant scientist reminiscent of Marie Curie; and a poet, a nod to the game’s creator himself. Much to Pinsky’s surprise, this treatment was the one that Wolosenko and company opted for.

One of the loveliest aspects of the Mindwheel project is the genuinely warm, respectful relationship that developed between Pinsky and the young hackers at Synapse, these men who normally inhabited what might as well have been separate planets. Pinsky worked most closely with Steve Hales, who did the actual coding for the game in Mataga’s BTZ language. Hales, who had never voluntarily read a line of verse in his life, slowly discovered through the soft-spoken, thoughtful Pinsky a new respect for the written word and the power of literature: “He changed the way I read and write words forever.” For his part, Pinsky found the youthful can-do spirit at Synapse a relief from the “oppressive” corridors of academia; he was soon “making up excuses” to visit Synapse and “hang out.” Hales endeared himself to Pinsky from his first words: “I’d like to talk to you about your world,” a turn of phrase Pinsky found almost inexpressibly fresh and exciting. He took to using — and often charmingly misusing — the fascinating jargon, a delight to his poet’s soul, that was always flying through the air at Synapse. He accepted what he wryly refers to as his “assignments” from Hales and company with cheerful equanimity: write a “dialog table” for a given character for queries involving a given set of topics; write responses in which each of these fifty verbs is used successfully and unsuccessfully. The terms attached to even the framework of the game took a poetic turn under Pinsky’s influence, with “drivel” coming to mean amusing incidental messages that were essentially random, not germane to the plot or puzzles, and “weather” those that were.
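The team's "drivel"/"weather" jargon maps onto a simple dispatch pattern: messages keyed to plot state take priority, while random atmosphere fills the quiet moments. Here is a hypothetical modern Python sketch of that idea (this is not BTZ code, and the messages and flag names are invented for illustration):

```python
import random

# Hypothetical sketch of the Mindwheel team's jargon as described:
# "drivel" = amusing incidental messages, essentially random;
# "weather" = incidental messages germane to the plot or puzzles.

DRIVEL = [
    "A sequined dancer spins past, humming.",
    "Somewhere offstage, a harmonica wails.",
]

WEATHER = {
    # invented plot flag -> message shown while that flag is set
    "thugs_approaching": "The thugs edge closer to your bodyguard.",
}

def incidental_message(plot_flags, rng=random.random):
    # Weather (germane to the plot) always takes priority over drivel.
    for flag, message in WEATHER.items():
        if flag in plot_flags:
            return message
    if rng() < 0.3:  # fire a random bit of drivel only occasionally
        return random.choice(DRIVEL)
    return None
```

Passing `rng` as a parameter just makes the random behavior easy to pin down when testing the sketch.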

While the experience of actually developing Mindwheel was by everyone’s account an almost entirely positive one, its story is also one of crossed purposes between Pinsky and Wolosenko. Wolosenko clearly wanted to create a work of art that transcended the notion of a mere computer game. Thus the involvement of Pinsky in the first place, as well as the term “Electronic Novel” and his plan to package each title in the line inside a hardcover book of at least a hundred pages. (This latter was also, of course, a challenge to Infocom’s superb packaging, yet another reflection of a determination to do “everything that Infocom does, plus one.”) Pinsky, meanwhile, took the project as a chance to let his hair down and maybe reach the sort of popular readership that had inevitably eluded him thus far despite his stellar reputation inside the ivory tower. He was teaching a class about Shakespeare at the time, and thinking a lot about how the Bard had become the greatest writer in the history of the English language not by appealing to the highbrows but by writing popular entertainments for the masses. (Pinsky still remains admirably free of literary snobbery today, listing for example South Park as one of the “tremendous works of our time,” its creators amongst our “leading moralizers.”)

The idea of making the package for Mindwheel into a hardcover book was very much Ihor Wolosenko’s idea. I didn’t like it; I resisted it. I happened to refer to what we were doing as “the game.” To me, that was fresh and exciting. The guys at Synapse who were promoting it wanted to call it an “Electronic Novel,” because from their viewpoint that was fresh and interesting.

I was disappointed that the package would be a book. They wanted me to write the stuff for the book. I declined. It was produced by committee; I wound up sort of editing it. The book was the least interesting part for me. I’ve written books; I’ve published lots of books; I wasn’t particularly excited by the romance of having a book. Ihor’s marketing idea was that this would be somehow “highbrow.” I liked the idea that it was an entertainment, that it was a game. I wanted to get away from the “literary” genre. I wanted to write a really exciting, artistic game.

Pinsky noted in a contemporary interview that he didn’t particularly care if Mindwheel got a writeup in The Partisan Review because his name had already appeared there many times. Wolosenko, of course, would have killed for such a marker of literary status.

The book, which is credited to BTZ project manager Richard Stanford, is a rather labored piece; it’s quite clear that Synapse struggled to come up with material to fill its pages, resorting to leaving dozens of pages entirely blank in the name of an “Adventurer’s Diary” for note taking. Those pages which are filled strain to set up a believable science-fictional reason for the mind-delving you do in the game proper. It seems that the social order on Earth is about to collapse thanks to humankind’s ongoing irresponsibility and the sheer inertia of thousands of years of petty human history. The only hope for salvation rests, for reasons poorly defined at best, in the science of “neuro-electronic matrix research” (the terminological similarity to Wolosenko’s personal interest of neuro-linguistic programming is interesting), which will allow a traveler to visit “four minds of unusual power” whose echoes still persist in the very atmosphere — shades of Carl Jung’s ideas about a collective unconscious. The four minds will eventually lead you to the “Cave Master,” “the mysterious prehistoric, apelike being who apparently invented the lever, the flint blade, cave paintings, and the rhythmical group chant” and who holds the “Wheel of Wisdom” that can save humankind. The winning passage of Mindwheel, after the Wheel has been retrieved, indicates about how seriously Pinsky took this earnest frame.

"This formula," says Virgil through happy tears, "can disable every weapon of mass destruction on the planet! And that is only the first benefit. Your courage and brains have given us a glorious new chance!

"Already, the planet's magnetic field is changed, so that any politician who lies on television will be afflicted with instant, debilitating diarrhea, and immediate, spectacular skin blemishes!"

He beams and detaches your electrodes.

Exalted but a little drained, you wish only to rest a while, and then unwind, maybe by playing some harmless game.

No, Mindwheel is more electronic poem than electronic novel. The world of the four minds is a surrealistic, impressionistic riot of emotional imagery. The premise and that very description raise immediate warning flags to a jaded old IFer like me; the history of amateur interactive fiction is strewn with surrealistic explorations of the inner consciousness, generally from younger writers with a wide streak of overwrought self-indulgence. They’re almost uniformly awful. But — to state the obvious — the authors of these works are (presumably) not future Poet Laureates. Pinsky’s prose is bracing, his imagery consistently surprising and consistently as right as it is bizarre. To play Mindwheel is an overwhelming sensory experience — even as all of its sensations are evoked through pure text.

The Concert

The first mind you enter is that of Bobby Clemons, the rock star.

You stand on an immense stage. In front of you, a crowd roars like thunder. Someone has thrown a rose and a Baby Ruth candy bar onto the stage. High overhead, a huge video screen displays, over and over, the film of Bobby Clemon's assassination. In tight, sequined costumes, a chorus of singers writhes, imitating the gestures of the fatally wounded figure on the screen.

A ramp juts south into the crowd that pleads for you to come forward. A keyboard is on the east part of the stage, while to the west, some thugs seem about to overpower your bodyguard. They have clubs, and you hold only your harmonica; your pockets are empty. While the crowd screams for more, one of the singers beckons you to come offstage by the door northward behind you.

The scene is vaguely hilarious and vaguely disturbing. As you stalk the stage panties are flying, dancers are grinding, bodyguards and thugs are brawling, and the crowd is baying for your love or your blood, or more likely both. It’s rock and roll in all its Dionysian danger and splendor. The other minds are only slightly less crowded and just as evocative: the poet’s full of more wistful imagery of sex and love and life and death; the dictator’s, a barren, ugly place of stunted growth and pathetic posturing; the scientist’s, an immense chess board of cool, classical beauty.

The obvious literary antecedent of the whole endeavor is Dante’s The Divine Comedy, particularly its first part The Inferno. Pinsky makes his homage about as explicit as homages can be by naming the scientist who sends you on your journey into the minds Doctor Virgil, a reference to the Roman poet who served as Dante’s guide to humanity in all its facets. Other more subtle references are sprinkled throughout Mindwheel. More importantly, the feel of the environment is similar. Dante has been a long-term fixation of Pinsky, resulting most notably in the popular translation of The Inferno which he published a decade after Mindwheel, and which has led Nick Montfort to cheekily note Mindwheel as “the first work of interactive fiction to have influenced The Inferno.”

Like The Divine Comedy, Mindwheel manages to be personal as well as epic. Amidst all the other imagery you’ll find within it a brief homage to Pinsky’s early mentor, the iconoclastic poet Yvor Winters, as well as a more extended one to the Brooklyn Dodgers of the 1950s, those “boys of summer” who are the subject of the best book ever written about baseball. Indeed, the final puzzle of the game is a technically unfair one which requires you to do a bit of outside research into the only Brooklyn team to win the World Series. But go ahead and do the research; it’s good for you, and it’s trivial in the age of the Internet. Pinsky, who grew up in neighboring New Jersey, obviously followed the Brooklyn Bums and loved them dearly, obviously was as heartbroken as the rest of their fans when the team upped and moved to Los Angeles.

But the most personal of all parts of Mindwheel is, as you might expect, your excursion into the mind of the poet. Pinsky has since noted that one of the few sources of occasional tension between him and Hales stemmed from the former’s desire to just keep piling on more crazy world to explore while the latter insisted that there needed to be puzzles, pacing, the structure that would result in a real game with a score of sorts — presented as a summarized list of your achievements rather than a numerical value — and the possibility for victory. (Yes, this would seem to suddenly put Pinsky and Synapse on the opposite sides of the positions they had already staked in the novel/game dialectic. What can I say, other than that few philosophical positions survive contact with practicality.) Still, and for all that they were apparently a somewhat grudging addition on Pinsky’s part, Mindwheel‘s puzzles are mostly pretty good, managing to serve the themes with an emphasis on poetics, dialog, and symbolism rather than a bunch of mechanistic operations. Occasionally they’re more than pretty good, as in the case of the most intricate, rewarding, and personal puzzle of all: the completion of a sonnet using words gathered from the environment around you. The sonnet in question originated with the Renaissance poet Fulke Greville. The lines were, however, too long to fit on the 40-column screens used by many of Synapse’s customers, so Pinsky converted the poem from pentameter to tetrameter. The puzzle is brilliant because it so perfectly connects with the daily labors of the mind you’re exploring. You’re counting beats, looking at the rhyme scheme, seeking that word that fits mechanically and also just, well, fits. 
Pinsky, who labored always to find ways to make poetry relevant in people’s lives, was delighted when he saw a group of playtesting high-school kids “just trying to figure them [the sonnet and some other poetry-related puzzles] out because they’re having fun and want to do it.”

The Wheel

The central image of the Mindwheel itself is one that also appears in “The Figured Wheel,” a poem Pinsky published almost contemporaneously with the game. It’s another element that has continued to recur in Pinsky’s later work.

Imagine a wheel — a colossal, rotating wheel into which is drawn all of the images of a culture: every experience, every event, every object, every person’s mind and body. This wheel is a vortex which you must try to manipulate and understand.

It involves the idea of striving for control and mastery, and the world being so complicated that every time you strive you’re creating another system that becomes part of this big whirling thing which is everything everybody’s ever known or thought or dreamed up to amuse themselves. Jokes and technologies and mythologies and religions and roads and… just everything.

Such heady concepts aside, the question of what Mindwheel ultimately all means is a fraught one. There’s a telling moment near the end of the game where in order to progress you have to cold-bloodedly sacrifice a certain frog who’s been your loyal companion through most of the game. Trinity, Brian Moriarty’s masterpiece which we’ll be getting to in a future article, has a similar moment which is among its most moving and important, serving as a critique of the whole atomic doctrine of mutually assured destruction and the idea of sacrificing the few for the needs of the many which led to the atomic bombings of Hiroshima and Nagasaki. (Before you rush to comment, do know that the decision to drop those bombs is one with which I must unhappily agree.) But then Trinity is a work with some very clear messages to impart. In Mindwheel the sacrifice is played almost for laughs; the frog returns in the finale as a happy zombie.

Does this make Mindwheel a lesser work than Trinity? Well, it certainly takes itself less seriously, but we need not condemn it for that. There was a time when poets would compete to do their patrons proud by taking a well-known vignette out of the Bible or mythology and embellishing it over hundreds or thousands of lines of verse, adding layer after layer of pathos and sensuality and imaginative gilding, like a literary version of a guitar-shredding contest; see Shakespeare’s “Venus and Adonis” and “The Rape of Lucrece” for spectacular examples of the genre. There’s some of that same spirit to Mindwheel. Pinsky is having fun here. Poetry should be, whatever else it is, fun.

Pinsky was never more delighted by Mindwheel than when it managed to surprise him, which it did more often than you might expect thanks to the rather loosey-goosey and free-association-inclined BTZ parser.

I was playing the game with my fifteen-year-old son, and we got up a tree. There was a lizard at the base of the tree that would repeatedly kill us. I knew that it was random, but we were on a bad run. We also had our friend the frog with us in the tree. So we gave the disk to the frog and said, “Frog, go down and kill the lizard.” By God, he did it. And the message appeared that the lizard died spewing blood and pus. The creators of the game didn’t know what was going to happen.

One of his favorite anecdotes is that of the beautiful lady to whom a friend typed, “You look like my mother.” “I will look the way you want me to” was her alleged reply. (Unfortunately, the published version of the game yields the far less satisfying “Okay, I’ll look.” The problem with a parser like Synapse’s is that it might deliver something unexpected and brilliant from time to time in response to some unusual input, but nine times out of ten it just delivers gibberish or takes your command as meaning something that you really, really didn’t want to do.)

The period of Mindwheel’s development was a happy and fulfilling one for Pinsky, but a difficult one for Synapse. In addition to the Electronic Novel line, the company had just launched another bold new initiative: to develop a line of business applications — SynFile, SynCalc, and SynTrend — to be marketed and distributed by Atari themselves. In July of 1984, however, Jack Tramiel bought Atari (a story we’ll be getting to in detail in a future article), and promptly told Synapse that he didn’t want their applications and didn’t intend to pay for them. Synapse, who had invested heavily in the work, became just the latest of a long line of Tramiel suppliers to be double-crossed and financially destroyed by the old business warrior. Meanwhile the rest of the Atari 8-bit market, still Synapse’s bread and butter, was in increasingly dire straits, being pummeled by the Commodore 64. Flying high barely six months before, Synapse suddenly faced bankruptcy before they could release a single one of the Electronic Novels that they hoped would stake out for them a new place in the industry. A savior appeared in the form of Brøderbund, who agreed to buy Synapse and take them under their wing in October. The Carlstons knew and liked Wolosenko and the rest of the Synapse folks, and wanted their expertise in action-game programming as well as the promising Electronic Novel line; it was still the era of bookware, after all, with Infocom’s The Hitchhiker’s Guide to the Galaxy the talk of the industry.

Mindwheel

The release date for Mindwheel slipped a bit amidst all the chaos, from the planned late 1984 to February of 1985. It generated the last big wave of the already dying bookware storm, with some images that can seem as surreal today as anything in the game proper: Pinsky blinking amidst the strobe lights at the Winter Consumer Electronics Show; Pinsky waxing philosophical in those noted literary magazines Compute!’s Gazette and A.N.A.L.O.G. (“The #1 Magazine for Atari Computer Owners!”). It’s questionable, though, to what degree the press buzz translated into sales, although Mindwheel undoubtedly became by far the best-selling of the Electronic Novels — not, alas, a high bar to clear.

I’ve long since made my peace with the fact that traditional parser-driven interactive fiction is, due to various irresistible forces, just an intriguing blip in the histories of literature and/or gaming (take your pick) that will quite likely die entirely with my generation. In general, I think that’s fine; Shakespeare is still as beautiful and relevant as ever despite the fact that modern theater has as little in common with the Elizabethan stage as does textual interactive fiction with a modern graphical game. Certainly elaborate counter-factuals, whether in life or in history, are seldom all that productive. Yet it’s hard not to feel just a little bit wistful reading those old interviews with Pinsky where he throws out ideas of what he’d like to try in his next game whenever someone “asks me to do another of these”; wistful for that world, widely accepted as inevitable for a brief instant in the mid-1980s, when major writers — good writers — would be routinely asked whether their next work would be interactive or non-interactive.

Ah, well, at least we have Mindwheel. The Apple II version I’m providing for download here is probably your best bet, being very playable and also quite easy to get up and running in any number of slick Apple II emulators like AppleWin; be sure to answer “yes” to 80 columns and to turn on faster disk-drive emulation. It’s worth the effort. (Edit: Steve Hales has now made a web page that hosts Mindwheel for play online in a browser. You unfortunately can’t save, but this is by far the easiest way to get a taste of the experience.) Whatever the reasons for Mindwheel’s academic reputation today, it’s definitely not undeserved.

(This article draws heavily from Jason Scott’s interview with the ever thoughtful and articulate Robert Pinsky for Get Lamp. Magazine sources this time were: Antic of April 1983, November 1984, and July 1985; Ahoy! of February 1984; Compute!’s Gazette of June 1985; Analog of December 1985; QuestBusters of March 1985. There’s an interesting discussion of Mindwheel in Nick Montfort’s Twisty Little Passages and also in an article Pinsky himself wrote for the Autumn 1987 New England Review. Finally, Steve Hales’s brief recollections of working with Pinsky can be found in two places online.)


From Congo to Amazon

There are new ways of presenting information other than the traditional ways in which the reader or viewer is required to be passive. A few years ago, I realized that I didn’t know about these things, and that I’d better find out about them. The only way I could learn was to actually go and do one. So I said, “Well, I’ll just make a game and then I’ll learn.” And I certainly did.

— Michael Crichton, 1984

Anyone who had been reading Michael Crichton’s novels prior to the founding of the Telarium brand had to know of his interest in computers. The plot of 1972’s The Terminal Man, of a man who has a computer implanted in his brain, is the sort of thing that would become commonplace in science fiction only with the rise of cyberpunk more than a decade later. And of course computers are also all over 1980’s Congo; indeed, they’re the only reason the heroes are out there in the jungle in the first place. Crichton’s personal history with computers also stretches back surprisingly far. Always an inveterate gadget freak, he bought his first computer-like machine in the form of an Olivetti word processor almost as soon as his earnings from his first hit novel, The Andromeda Strain, made it possible. He wrote his books for years on the Olivetti. When the trinity of 1977 arrived, he quickly jumped aboard the PC revolution with an Apple II, first of a stable that within a few years would also include Commodores, Radio Shacks, and IBMs.

Never shy about sharing his interests in print, Crichton became a semi-regular contributor to Creative Computing magazine, who were thrilled to have a byline of his prominence under any terms. Thus they gave him free rein to opine in the abstract:

I would argue that it [computer technology] is a force of human evolution, opening new possibilities for our minds, simultaneously freeing us from drudgery while presenting us with a parody of our own rational sides. Computers actually show us both the benefits and the limits of rationality with wonderful precision. What could be more rational than that pedantic little box that keeps saying SYNTAX ERROR over and over? And what does our frustration suggest to us, in terms of other things to do and other ways to be?

But Crichton was more than the mere dabbler that poeticisms like the above might suggest. He took the time to learn how to program his toys, publishing fairly intricate program listings in BASIC for applications such as casting the I Ching (a byproduct of his seldom-remarked interest in mysticism; see his nonfiction memoir Travels, which might just be the most interesting thing he ever wrote); identifying users based on their typing characteristics (inspired by his recent short story “Mousetrap”); and creating onscreen art mirroring that of abstract painter Josef Albers (Crichton’s interest in and patronage of the visual arts also tends to go unremarked). In 1983 he published the book Electronic Life: How to Think About Computers, a breezy introduction for the layman which nevertheless shared some real wisdom on topics such as the absurdity of the drive for “computer literacy” which insisted that every schoolchild in the country needed to know how to program in BASIC to have a prayer of success in later life. It also offered a spirited defense of computers as tools for entertainment and creativity as well as business and other practical matters.

Which isn’t to say that he didn’t find plenty of such practical applications for his computers. During this part of his life Crichton was immersed in planning for a movie called Runaway, which was to star Tom Selleck and Gene Simmons of Magnum P.I. and Kiss fame respectively. He hoped it would be one of the major blockbusters of 1984, although it would ultimately be overshadowed by a glut of other high-profile science-fiction films that year (The Terminator, Star Trek III, 2010). He hired a team to create a financial-modeling package which he claimed would allow a prospective filmmaker to input a bunch of parameters and have a shooting budget for any movie in “about a minute.” It was soon circulating amongst his peers in Hollywood.

Thus when the folks at Telarium started thinking about authors who might be interested in licensing their books and maybe even working with them on the resulting adaptations, Crichton was a natural. Seth Godin approached him in late 1983. He returned with extraordinary news: not only was Crichton interested, but he already had a largely completed game for them, based on his most recent novel, Congo.

Crichton had first started thinking he might like to write a game as long as two years before Godin’s inquiry. He’d grown frustrated with the limitations of the adventure games he’d played, limitations which seemed to spring not just from the technology but also from the lack of dramatic chops of their programmers.

I simply didn’t understand the mentality that informed them. It was not until I began programming myself that I realized it was a debugger’s mentality. They could make you sit outside a door until you said exactly the right words. Sometimes you had to say, “I quit,” and then it would let you through.

Well, that’s life in the programming world. It’s not life in any other world. It’s not an accepted dramatic convention in any other arena of entertainment. It’s something you learn to do when you’re trying to make the computer work.

Here’s what I found out early on: you can’t have extremely varied choices that don’t seem to matter. I can go north, south, east, or west, and who cares? You can only do that for a while, and then if you don’t start to have an expectation of what will happen, you’ll stop playing the game. You’d better get right going and you’d better start to have something happen.

If I play a game for a half-hour and it doesn’t make any sense to me, I’ll just quit and never go back. Say I’m locked in this house and I don’t know what the point of the house is and why I can’t get out and there’s no sort of hint to me about the mentality that would assist me in getting out — I don’t know. I could say “Shazam!” or I could burn the house down or — give me a break. I just stop.

Crichton started to sketch out his own adventure game based on Congo, whose simple quest plot structure made it a relatively good choice for conversion to the new format. Realizing that his programming skills weren’t up to the task of implementing his ideas, he hired programmer Stephen Warady to write the game in Apple II assembly language. The little team was eventually completed by David Durand, an artist who normally worked in film graphics. The game as it evolved was as much a mixed-media experience as a text adventure, incorporating illustrations, simple action games, and other occasional graphical interludes that almost qualify as cut scenes, perfectly befitting this most cinematic of writers (and, not incidentally, making the game a perfect match with Telarium’s other games once they finally came calling). Crichton would sometimes program these sequences himself in BASIC, then turn them over to Warady to redo in much faster assembly language. Given Crichton’s other commitments, work on Congo the game proceeded in fits and starts for some eighteen months. They were just getting to the point of thinking about a publisher when Godin arrived to relieve them of that stress.

When Spinnaker started their due diligence on the deal, however, a huge problem quickly presented itself: Crichton, as was typical for him by this time, had already sold the media rights to Congo to Hollywood. (After they languished there for many years, the success of the Jurassic Park film would finally prompt Paramount Pictures to pick them up and make a Congo movie at last in 1995. Opinions are divided over whether that movie was just bad or so cosmically bad that it became good again.) Those rights unfortunately included all adaptations, including computer games, something the usually business-savvy Crichton had totally failed to realize. Spinnaker may have been a big wheel in home computers, but they didn’t have much clout in Hollywood. So, they came up with another solution: they excised the specifics of the novel from the game, leaving just the plot framework. The Congo became the Amazon; Amy the signing gorilla became Paco the talking parrot; Earth Resources Technology Services became National Satellite Resources Technology; a diamond mine became an emerald mine; African cannibals and roving, massacring army troops became South American cannibals and roving, massacring army troops. It may not have said much for Crichton and Spinnaker’s appreciation for cultural diversity, but it solved their legal problems.

Amazon was written for the Apple II in native assembly language. Spinnaker, however, took advantage of the rare luxury of time — the game was in an almost completed state when Crichton signed in late 1983, fully one year before the Telarium line’s launch — to turn it over to Byron Preiss Video Productions to make a version in SAL for the all-important Commodore 64 platform. The result wasn’t quite as nice an experience as the original, but it was acceptable. And it was certainly a wise move: Amazon became by all indications the most successful of all the Telarium games. Some reports have it selling as many as 100,000 copies, very good numbers for a member of a line whose overall commercial performance was quite disappointing. The majority of those were most likely the Commodore 64 version, if sales patterns for Amazon matched those for the industry as a whole.

I do want to talk about Amazon in more detail; it’s an historically important game thanks, if nothing else, to Crichton’s involvement, and also a very interesting one, with some genuinely new approaches. But we’ll save that discussion for next time. In the meantime, feel free to download the Apple II version from here if you’d like to get a head start. Note that disk 3 is the boot disk.

(All of the references I listed in my first article on bookware still apply. Useful interviews with Crichton appeared in the February 1985 Creative Computing and February 1985 Compute!. Other articles and programs by Crichton appeared in Creative Computing’s March 1983, June 1984, and November 1984 issues.)


Posted by on October 11, 2013 in Digital Antiquaria, Interactive Fiction


A Computer for Every Home?

On January 13, 1984, Commodore held their first board of directors meeting of the year. It should have been a relaxed, happy occasion, a time to make plans for the new year but also one last chance to reflect on a stellar 1983, a year in which they had sold more computers than any two of their rivals combined and truly joined the big boys of corporate America by reaching a billion dollars in gross sales. During the last quarter of 1983 alone they had ridden a spectacular Christmas buying season to more than $50 million in profits. Commodore had won the Home Computer Wars convincingly, driving rival Texas Instruments to unconditional surrender. To make the triumph even sweeter, rival Apple had publicly announced the goal of selling a billion dollars’ worth of their own computers that year, only to fall just short thanks to the failure of the Lisa. Atari, meanwhile, had imploded in the wake of the videogame crash, losing more than $500 million and laying off more than 2000 workers. Commodore had just the previous summer moved into a sprawling new 585,000-square-foot, two-story headquarters in West Chester, Pennsylvania that befitted their new stature; some of the manufacturing spaces and warehouses in the place were so large that Commodore veterans insist today that they had their own weather. Yes, it should have been a happy time at Commodore. But instead there was doubt and trepidation in the air as executives filed into the boardroom on that Friday the 13th.

A day or two before, Jack Tramiel had had a heated argument with Irving Gould, Commodore’s largest shareholder and the man who controlled his purse strings, in the company’s private suite above their exhibit at the 1984 Winter Consumer Electronics Show. That in itself wasn’t unusual; these two corrupt old bulldogs had had an adversarial relationship for almost two decades now. This time, however, observers remarked that Gould was shouting as much as Tramiel. That was unusual; Gould normally sat impassively until Tramiel exhausted himself, then quietly told him which demands he was and wasn’t willing to meet. When Tramiel stormed red-faced out of the meeting and sped away in the new sports car he’d just gotten for his 55th birthday, it was clear that this was not just the usual squabbling. Now observers outside the board-of-directors meeting, which was being chaired as usual by Gould, saw him depart halfway through in a similar huff. He would never darken Commodore’s doors again.

No one who was inside that boardroom has ever revealed exactly what transpired there. With Gould and Tramiel both now dead and the other former board members either dead or aged, it’s unlikely that anyone ever will. On the face of it, it seems hard to imagine. What could cause these two men who had managed to stay together through the toughest of times, during which Commodore had more than once teetered on the edge of bankruptcy, to irrevocably split now, when their company had just enjoyed the best year in its history? We can only speculate.

Commodore had ceased truly being Tramiel’s company in 1966, when Gould swooped in to bail him out in the aftermath of the Atlantic Acceptance scandal of the previous year. Tramiel, however, never quite got the memo. He continued to run the company like a sole proprietor to whatever extent Gould would let him. Tramiel micro-managed to an astonishing degree. He did not, for instance, believe in budgets, considering them a “license to steal,” a guarantee that the responsible manager, knowing he had X million available, would always spend at least X million. Instead he demanded that every expenditure greater than $1000 be approved personally by him, with the result that much of the company ground to a halt any time he took a holiday. Even as Tramiel enjoyed his best year ever in business, Gould and others in the financial community were beginning to ask the very reasonable question of whether this was really a sustainable way to run a billion-dollar company.

Still, the specific cause of Tramiel’s departure seems likely to have involved his sons. Tramiel valued family above all else, and, like a typical small businessman, dreamed of leaving “his” company to his three sons. Whether by coincidence or something else, it even worked out that each son had an area of expertise that would be critical to running a company like Commodore. Sam, the eldest, had trained in business management at York University, while Gary, the youngest, was a financial analyst with a degree from Manlow Park College and experience as a stockbroker at Merrill Lynch. Leonard, the middle child, was the intellectual and the gearhead; he was finishing a PhD in astrophysics at Columbia, and was by all accounts quite an accomplished hardware and software hacker. Sam and Gary already worked for Commodore, while Leonard planned to start as soon as he finished his PhD in a few more months. Various witnesses have claimed that Tramiel the elder now wished to begin more actively grooming this three-headed monster to take more and more of his responsibilities, and someday to take his place. Feeling nothing good could come out of such blatant nepotism inside a publicly traded corporation that was trying to put its somewhat seedy history behind it, Gould refused absolutely to countenance such a plan. Given Tramiel’s devotion to his family and his attitude toward Commodore as his personal fiefdom, it does make a degree of sense that this particular rejection might have been more than he could stomach.

In any case, Tramiel was gone, and Gould, who had made his fortune in the unglamorous world of warehousing and shipping and was reportedly both a bit jealous of Tramiel’s high profile in an exciting, emerging industry and a bit embarrassed by his gruff, untutored ways, didn’t seem particularly distraught about it. The man he brought in to replace him could hardly have been more different. Marshall F. Smith was a blandly feckless veteran of boardrooms and country clubs who had spent his career in the steel industry. It’s hard to grasp just why Gould latched onto Smith of all people. Perhaps he was following the lead of Apple, who the previous year had brought in their own leader from outside the computer industry, John Sculley. Sculley, however, understood consumer marketing, having cut his teeth at Pepsi, where he was the mastermind behind the Pepsi Challenge, still one of the most iconic and effective advertising campaigns in the long history of the Cola Wars. The anonymous world of Big Steel offered no comparable experience. Smith’s appointment was the first of a long string of well-nigh incomprehensible mistakes Gould would make over the next decade. Engineers that were initially thrilled to have proper funding and actual budgets at last were soon watching with growing concern as Smith puttered about with a growing management bureaucracy and let the company drift without direction. Many were soon muttering that it’s often better to make a decision — even the wrong decision — than to just let things hang. Whatever else you could say about Jack Tramiel, he never lacked the courage of his convictions.

Commodore’s first significant new models, which reached stores at last in October of 1984, more than two years after the Commodore 64, hardly did much to inspire confidence in the new regime. Nothing about the Commodore 16 and the Plus/4 made any sense at all. The 16 was an ultra-low-end model with just 16 K of memory, long after the time for such a beast had passed. The trend in even inexpensive 8-bit computers was onward, toward the next magic number of 128 K, not backward to the late 1970s.

The Commodore Plus/4


As for the Plus/4, which like the 64 was built around a variant of the 6502 CPU and had the same 64 K of memory but was nevertheless incompatible… well, it was the proverbial riddle wrapped in a mystery inside an enigma. It was billed as a more “serious” machine than the 64, a computer for “home and business applications” rather than gaming, and priced to match at about $300, more than $100 more than the 64. It featured four applications built right into its ROM (thus the machine’s name): a file manager, a word processor, a spreadsheet, and a graphing program. All were pathetically sub-par even by the standards of Commodore 64 applications, hardly the gold standard in business computing. The Plus/4 lacked the 64’s sprites and SID sound chip, which made a degree of sense; for a dismaying number of years yet a lack of audiovisual capability would be taken as a signifier of serious intent in computing. But why did it offer more colors, 128 as opposed to the 64’s 16? And as an allegedly more serious computer, why didn’t it offer the 80-column display absolutely essential for comfortable word processing and other typical productive tasks? And as a more serious (and expensive) computer, why did it have a rubbery keyboard almost as awful to type on as the IBM PCjr’s Chiclet model? And would all those serious, more productive buyers really be doing a lot of BASIC programming? If not, why was one of the main selling points a much better BASIC than the bare-bones edition found in the 64? Info, a magazine that would soon build a reputation for saying the things about Commodore’s bizarre decisions that nobody else would, gave the Plus/4 a withering review:

The biggest problem with the Plus/4 is the fundamental concept: an 8-bit, 64 K, 40-column desktop personal computer. Commodore already makes the best 8-bit, 64 K, 40-column desktop personal computer you can buy, with literally thousands of products supporting it! Why should consumers want a “new” machine with no significant advances, several new limitations, and virtually no third-party product support? And why would a company with no competition in the under-$500 category bring out an incompatible [machine] that can’t compete with anybody’s machine except their own? It just doesn’t compute!

Info ran a wonderfully snarky contest in the same issue, giving away the Plus/4 they’d just reviewed. After all, it was “sure to become a collector’s item!” Even the more staid Compute!’s Gazette managed to flummox a poor Commodore representative with a single question: “Why buy a 264 [a pre-release name for the Plus/4] instead of a 64 that has a word processor and, say, a Simon’s BASIC? It would be the equivalent of the 264 for less money.” Commodore happily claimed that the Plus/4 had enough utility built right in for the “average small business” (maybe they meant one of the vast majority that fail within a year or two anyway), but in reality it seemed like it had been cobbled together from spare parts that Commodore happened to have lying around. In fact, that’s not far from what happened — and Tramiel actually bears as much responsibility for the whole fiasco as the clueless Marshall Smith.

Tramiel, you’ll remember, had driven away the heart of his engineering team in his usual hail of recriminations and lawsuits shortly after they had created the 64 for him. He did eventually find more talented young engineers, notably Bil Herd and Dave Haynie. (Commodore always preferred their engineers young and inexperienced because that way they didn’t have to pay them much — a strategy that sometimes backfired but was sometimes perversely successful, netting them brilliant, unconventional minds who would have been overlooked by other companies.) When Herd arrived at Commodore in early 1983, engineers had been tinkering for some time with a new video and audio chip, the TED (short for Text Display). With engineering straitened as ever by Tramiel’s aversion to spending money, the 23-year-old Herd soon found himself leading a project to make the TED the heart of a new computer, despite the fact that it was in some ways a step back, lacking the sprites of the 64’s VIC chip and the marvelous sound capabilities of its SID chip. Marketing came up with the dubious idea of including applications in ROM, which by all accounts delighted Tramiel.

Tramiel, who at some fundamental level still thought of the computers he now sold like the calculators he once had, failed to grasp that the whole value of a computer is the ability to do lots of different things with it, to have lots and lots of options its designers may never have anticipated, all through the magic of software. Locking applications into ROM, making them impossible to replace or update, was kind of missing the point of building a computer in the first place. Failing to understand that a computer is only as good to consumers as the quality and variety of its available software, Tramiel also saw no problem with making the new machine incompatible with the 64. It seems to have come as a complete surprise to him when the machine was announced at that fateful Winter CES and everyone’s first question was whether they could use it to run the Commodore 64 software they already had.

After Tramiel’s abrupt departure, Commodore pushed ahead with the 16 and Plus/4 in the muddled way that would be their wont for the rest of the company’s life, despite a skeptical press and utterly indifferent consumers. It all made so little sense that some have darkly hinted of a conspiracy hatched by Tramiel amongst his remaining loyalists at Commodore to get the company to waste resources, time, and credibility on these obvious losers. (Tramiel recruited a substantial number of said loyalists to join him after he purchased Atari and got back in the home-computer game — exactly the sort of thing for which he so often sued others. But that’s a story for a later article.) Incredibly, given the cobbled-together nature of the machine, it took nine more months after that CES to finally get the 16 and Plus/4 into production and watch them duly flop. Again, such a glacial pace would prove to be a consistent trait of the post-Tramiel Commodore.

By the time they did appear at last, the poor, benighted 16 and Plus/4 had more working against them than just their own failings, considerable as those may have been. The year as a whole was marked by failures in the home-computer segment of the market. Atari was reeling. Coleco was taking massive losses on their tardy entry into the home-computing field, the Adam. And of course I’ve already told you about the IBM PCjr.

Even Apple, who had enjoyed a splashy, successful launch of their new higher-end Macintosh (another story for a later date), had a somewhat disappointing new model amongst their bread-and-butter Apple II line. The “c” in the Apple IIc’s name stood for “compact,” and it was indeed a much smaller version of Steve Wozniak’s old evergreen design. Like the Macintosh, it was a closed system designed for the end user who just wanted to get work (or play) done, not for the hackers who had adored the earlier editions of the II with their big cases and heaps of inviting expansion slots. The idea was that you would get everything you, the ordinary user, really needed built right in: all of the fundamental interface cards, a disk drive, a full 128 K of memory (as much as the Macintosh), etc. All you would really need to add to have a nice home-office setup was a monitor and a printer.

The Apple IIc

But the IIc was not envisioned just as a more practical machine: as the only II model after the first with which Steve Jobs played an important role, it evinced all of his famous obsession with design. Indeed, much of the external look and sensibility that we associate with Apple today begins as much here as with the just slightly older — and, truth be told, just slightly clunkier-looking — first Macintosh model. The Apple IIc was the first product of what would turn into a longstanding partnership with the German firm Frog Design. It marks the debut of what Apple referred to as the “Snow White” design language — slim, modern, sleek, and, yes, white. Everything about the IIc, including the packaging and the glossy manuals inside, oozed the same chic elegance.

Apple introduced the IIc at a lavish party and exhibition in San Francisco’s Moscone Center in April of 1984, just three months after a similar shindig to launch the Macintosh. The name was chosen to mollify restless Apple II owners who feared — rightly so, as it would turn out; even at “Apple II Forever” Jobs made time for a presentation on “The First 100 Days of Macintosh” — that Sculley, Jobs, and their associates had little further interest in them. Geniuses that they have always been for burnishing their own myths, Apple built a museum right there in the conference center, its centerpiece a replica of the garage where it had all begun. The IIc unveiling itself was an audiovisual extravaganza featuring three huge projection screens for the music video Apple had commissioned for the occasion. The most dramatic and theatrical moment came when Sculley held the tiny machine above him onstage for the first time. As the crowd strained to see, he asked if they’d like a closer look. Then the house lights suddenly came up and every fifth person in the audience stood up with an Apple IIc of her own to show and pass around.

Apple confidently predicted that they would soon be selling 100,000 IIcs every month on the strength of the launch buzz and a $15 million advertising campaign. In actuality the machine averaged just 100,000 sales per year over its four years in Apple’s product catalogs. The old, ugly IIe outsold its fairer sibling handily. This left Apple in a huge bind for a while, for they had all but stopped production of the IIe in anticipation of the IIc’s success while wildly overproducing IIcs for a rush that never materialized. Thus for some time stores were glutted with the IIcs that consumers didn’t want, while the IIes that they did want were nowhere to be found. (It’s interesting to consider that during each machine’s first year on the market the PCjr almost certainly sold more units than the IIc, yet it’s the PCjr, not the IIc, that wears the label of outright flop. Narratives can be funny things.)

It remains even today somewhat unclear why the world never embraced the IIc as it had the three Apple II models that preceded it. There’s some evidence to suggest that consumers, not yet conditioned to expect each new generation of computing technology to be both smaller and more powerful than the previous, took the IIc’s small size to be a sign that it was not as serious or powerful as the IIe. Apple was actually aware of this danger before the IIc debuted. Thus the advertising campaign worked hard to explain that the IIc was more powerful than its size would imply, with the tagline, “Announcing a technological breakthrough of incredible proportions.” Yet it’s doubtful whether this message really got through. In addition, the IIc was, like the PCjr, an expensive proposition for the home-computer buyer: almost $1300, $300 more than a basic IIe. For that price you got twice the memory of the IIe as well as various other IIe add-on options built right in, but the value of all this may have been difficult for the novice buyer, the IIc’s main target, to grasp. She may just have seen that she was being asked to pay more for a smaller and thus presumably less capable machine, and gone with the bigger, more serious-looking IIe (if anything from Apple).

Then again, maybe the IIc was just born under a bad sign. As I’ve already noted, nobody was having much luck with their new home computers in 1984, almost regardless of their individual strengths and weaknesses.

But why was this trend so universal? That’s what people inside the industry and computer evangelists outside it were asking themselves with increasing urgency as the year wore on. As 1984 drew toward a close, the inertia began to affect even the most established warhorses, the Commodore 64 and the Apple IIe. Both Commodore and Apple posted disappointing Christmas numbers, down at least 20% from the year before, and poor Commodore, now effectively a one-product company completely reliant on continuing sales of the 64, sank back well below that magic billion-dollar threshold again. In the grand scheme of things the Commodore 64 was still a ridiculously successful machine, by far the bestselling computer in the world and the preeminent gaming platform of its era. Yet there increasingly seemed to be something wrong with the home-computer revolution as a whole.

Commodore 64 startup screen

The fact was that a backlash had been steadily building almost from the moment that the spectacular Christmas 1983 buying season had ended. Consumers had begun to say, and not without considerable justification, that home computers promised far more than they delivered. Watching all those bright, happy faces in television and print advertising, people had bought computers expecting them to do the things that the computers there were doing. As Commodore’s advertising put it, “If you’re not pleased with what’s on your TV set tonight, simply turn on your Commodore 64.” Yet what did you get when you turned on your 64 — after you figured out how to connect it to your TV in the first place, that is? No bright fun, just something about 38,911 somethings, a READY prompt, and a cryptically blinking cursor. Everything about television was easy; everything about computers was hard. Computers had been sold to consumers like any other piece of consumer electronics, but they were not like any other piece of consumer electronics. For the vast majority of people — those who had no intrinsic fascination with the technology itself, who merely wanted to do the sorts of things those families on TV were doing — they were stubborn, frustrating, well-nigh intractable things. Ordinary consumers were dutifully buying computers, but computers were at some fundamental level not yet ready for ordinary consumers.

The computer industry was still unable to really answer the question which had dogged and thwarted it ever since Radio Shack had run the first ads showing a happy housewife sorting her recipes on a TRS-80 perched on the kitchen table: why do I, the ordinary man or woman with children to feed and a job to keep, need one? Commodore had cemented the industry’s go-to rhetoric with the help of William Shatner in their VIC-20 advertising campaign that first carved out a real market segment for home computers. You needed a computer for productivity tasks and for your children’s future, “Johnny can’t read BASIC” having replaced “Johnny can’t read” as the marker of a neglectful parent. Entertainment was relegated to an asterisk at the end: “Plays great games too!”

Yet, honestly, how productive could you really be with even the Commodore 64, much less the 5 K VIC-20? Some people did manage to do productive things with their 64s, but most of those who did forgot or decided not to ask themselves a simple question: is doing this on the computer really easier than the alternative? The answer was almost always no. Hobbyists chose to do things on the computer because it was cool, not because it was practical. Never mind if it took far more effort to keep one’s address book on the Commodore 64, what with its slow disk drive and quirky, unrefined software, than it would have to just have a paper card file. Never mind if it was much riskier as well, prone to deletion by an errant key swipe or a misbehaving disk drive. It was cooler, and that was all that mattered — to a technology buff. Most other people found it easier to address their Christmas cards by hand than to try to feed envelopes through a tractor-fed dot-matrix printer that made enough noise to wake the neighbors.

Perhaps the one possible compelling productive use of a machine like the Commodore 64 in the home was as a word processor. Kids today can’t imagine how students once despaired when their teachers told them that a report had to be typed back in the era of typewriters, can’t conceive how difficult it was to get anything on paper in typewritten form when every mistake made by untutored fingers meant trying to decide between pulling out the Liquid Paper or just starting all over again. But even word processing on the 64 was made so painful by the 40-column screen and manifold other compromises that there was room to debate whether the cure was worse than the disease. Specialized hardware-based word processors became hugely popular during this era for just this reason. These single-function, all-in-one devices were much more pleasant to use than a Commodore 64 equipped with a $30 program, and cheaper than buying a whole computer system, especially if you went with a higher priced and thus more productively useful model like the Apple II.

The idea that every child in America needed to learn to program, lest she be left behind to flip burgers while her friends had brilliant careers, was also absurd on the face of it. It was akin to declaring during the days of the Model T that every citizen needed to learn to strip down and rebuild one of these newfangled automobiles. Basic computer literacy was important (and remains so today); BASIC literacy was not. What a child really needed to know could largely be taught in school. Parents needn’t have fretted if Junior preferred reading, listening to music, playing sports, or practicing origami to learning the vagaries of PEEKs and POKEs in BASIC 2.0. There would be time enough for computing when computing and Junior had both grown up a bit.

So, everything had changed yet nothing had changed since the halcyon days of the trinity of 1977. Computers were transforming the face and in some cases the very nature of business, yet there remained just two compelling reasons to have one in the home: 1) for the sheer joy of hacking or 2) for playing games. Lots more computers were now being used for the latter than the former, thanks to the vastly more and vastly better games that were now available. But for many folks games just weren’t a compelling enough reason to own one. The Puritan ethic that makes people feel guilty about their pleasures was as strong in America then as it remains today. It certainly didn’t help that the media had been filled for several years now with hand-wringing about the effect videogames were having on the psyches of youngsters. (This prompted many software publishers of this period to work hard, albeit likely with limited success, to label their computer games as something different, something more cerebral and rewarding and even, dare we say it, educational than their simplistic videogame cousins.)

But, perhaps most of all, computers still remained quite expensive when you really dug into everything you needed for a workable system. Yes, you could get a Commodore 64 for less than $200 by the Christmas of 1983. But then you needed a disk drive ($220) if you wanted to do, well, much of anything with it; a monitor ($220) if you wanted a nice picture and didn’t want to tie up the family television all the time; a printer ($290) for word processing, if you wanted to take that fraught plunge; a modem ($60) to go online. It didn’t take long until you were approaching four digits, and that’s without even entering into a discussion of software. There was thus a certain note of false advertising in that sub-$200 Commodore 64. And because these machines were being sold through mass merchandisers rather than dealers, there was no one who really knew better, who could help buyers to put a proper system together at the point of sale. Consumers, conditioned by pretty much everything else that was sold to them to expect a product to work on its own, were often baffled and frustrated to realize that they had bought an expensive doorstop. Many of the computers sold during that Christmas of 1983 were turned on a few times only, then consigned to the back of the closet or attic to gather dust. The bad taste they put in many people’s mouths would take years to go away. Meanwhile the more complete, useful machines, like the Apple IIc and the PCjr, were still more expensive than a complete Commodore 64 system — and the games on them weren’t as good to boot. Hackers and passionate gamers (or, perhaps more commonly, their generous parents) were willing to pay the price. Curious novices largely were not. Faced with no really good all-purpose options, many — most, actually — soon decided home computers just weren’t worth it. The real home-computer revolution, as it turned out, was still almost ten years away.

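The arithmetic behind that “approaching four digits” is easy enough to check. A minimal tally, sketched in Python with the component prices exactly as quoted above (the $200 figure for the 64 itself is rounded up from “less than $200”):

```python
# Tallying the cost of a "complete" Commodore 64 system at Christmas 1983
# prices, as quoted in the text. Software not included.
prices = {
    "Commodore 64": 200,  # "less than $200" -- rounded up for simplicity
    "disk drive": 220,
    "monitor": 220,
    "printer": 290,
    "modem": 60,
}

total = sum(prices.values())
print(total)  # 990 -- just shy of four digits, before buying any software
```

Even without the printer and modem, the core trio of computer, disk drive, and monitor already came to $640, more than three times the advertised price of the computer alone.
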
About 15% of American homes had computers — at least ostensibly; many of them were, as just mentioned, buried in closets — by January 1, 1985, but that figure would rise with agonizing slowness for the rest of the decade. People could still live perfectly happy lives fully plugged into the cultural discourse around them and raise healthy, productive children in the process without owning a computer. Only much later, with the arrival of the World Wide Web and computers equipped with more intuitive graphical user interfaces for accessing it, would that change.

Which is not to say that the software and information industries that had exploded in and around the home-computer revolution during 1982 and 1983 died just like that. Many of their prominent members, however, did, as the financial gambles they had taken in anticipation of the home-computer revolution came back to haunt them. We’ve just seen how Sierra nearly went under during this period. Muse Software and Scott Adams’s Adventure International, to name two other old friends from this blog, weren’t so lucky; both folded in 1985. Electronic Arts survived, but steered their rhetoric and choice of titles somewhat away from Trip Hawkins’s original vision of “consumer software” and more toward the hardcore, in proven hardcore genres like the CRPG and the adventure game.

Magazines were even harder hit. By early 1984 there were more than 300 professionally published computing periodicals of one sort or another, many of them just founded during the boom of the previous year. Well over half of these died during 1984 and 1985. Mixed in with the dead Johnny-come-latelys were some cherished veteran voices, among them pioneers Creative Computing (1974), SoftSide (1978), and Softalk (1980). Softalk’s demise, after exactly four years and 48 issues of sometimes superb people-focused journalism, came as a particular blow to the Apple II community; Apple historian Steven Weyhrich names this moment as nothing less than the end of the “golden age” of the Apple II. Those magazines that survived often did so in dramatically shrunken form. Compute!, for instance, went from 392 pages in December of 1983 to 160 ten months later.

Yet it wasn’t all doom and gloom. Paradoxically, some software publishers still did quite well. Infocom, for example, had the best single year in their history in 1984 in terms of unit sales, selling almost 750,000 games. It seemed that, with more options than ever before, software buyers were becoming much more discerning. Those publishers like Infocom who could offer them fresh, quality products showing a distinctive sensibility could do very well. Those who could not, like Adventure International with their tired old two-word parsers and simplistic engines, suffered the consequences. That real or implied asterisk (“Plays great games too!”) at the end of the advertising copy remained the great guilty secret of the remaining home-computer industry, the real reason computers were in homes at all. Thankfully, the best games were getting ever more complex and compelling; otherwise the industry might have been in even more trouble than it actually was.

Indeed, with a staggering number of machines already out there and heaps still to be sold for years to come, the golden age for Commodore 64 users was just beginning. This year of chaos and uncertainty was the year that the 64 really came into its own as a games machine, as programmers came to understand how to make it sing. Companies who found these keyboard maestros would be able to make millions from them. The home-computer revolution may not have quite panned out as anticipated and the parent company may have looked increasingly clueless, but for gamers the Commodore 64 stood alone with its combination of audiovisual capability, its large and ever growing catalog of games, and its low price. What with game consoles effectively dead in the wake of Atari’s crash and burn, all the action was right here.

In that spirit, we’ll look next time at the strange transformation that led one of our stodgiest old friends from earlier articles to become the hip purveyor of some of the slickest games that would ever grace the 64.

(The indispensable resources on Commodore’s history remain Brian Bagnall’s On the Edge and its revised edition, Commodore: A Company on the Edge. Frank Rose’s West of Eden is the best chronicle I know of this period of Apple’s history. The editorial pages and columnists in Compute! and Compute!’s Gazette provided a great unfolding account of a chaotic year in home computing as it happened. Particular props must go to Fred D’Ignazio for pointing out all of the problems with the standard rhetoric of the home-computer revolution in Compute!’s May 1984 issue — but he does lose points for naming the PCjr as the answer to all these woes in the next issue.)

 
